PySpark get minute only?


In Snowflake, you can do something like:

SELECT
my_event_time,
DATE_TRUNC('minute',my_event_time)::TIME AS minute
FROM table

And it would return something like:

my_event_time             | minute
-------------------------------------
2020-08-17 13:23:49.227   | 13:23:00

Can this be done in a PySpark DataFrame, removing everything except the minute-truncated time? date_trunc('minute', ...) in PySpark does something else; it doesn't remove the date part.
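
For reference, a minimal sketch (the one-row DataFrame here is made up to mirror the Snowflake example) showing that Spark's date_trunc keeps the date part:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical one-row DataFrame mimicking the Snowflake example
df = spark.createDataFrame(
    [("2020-08-17 13:23:49.227",)], ["my_event_time"]
).withColumn("my_event_time", F.to_timestamp("my_event_time"))

# date_trunc zeroes out the seconds but still returns a full timestamp,
# e.g. 2020-08-17 13:23:00, not a bare time like 13:23:00
df.withColumn("minute", F.date_trunc("minute", "my_event_time")).show(truncate=False)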

CodePudding user response:

Use the date_format function and pass the required time format.

spark.sql("select date_format(current_timestamp,'HH:mm:ss') time").show()

+--------+
|    time|
+--------+
|10:48:13|
+--------+
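
To match the Snowflake output exactly (seconds zeroed out, e.g. 13:23:00), a sketch that combines date_trunc with date_format, applied to the hypothetical df from above:

# date_trunc zeroes the seconds, date_format drops the date part,
# leaving only the minute-truncated time, e.g. 13:23:00
df.withColumn(
    "minute",
    F.date_format(F.date_trunc("minute", "my_event_time"), "HH:mm:ss"),
).show(truncate=False)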

CodePudding user response:

Try this:

spark.sql("current_timestamp,minute(current_timestamp)").show()