Scala/Spark Change Datetime into EpochTime format


I'm trying to convert a datetime column into epoch time, but everything I can find on the subject covers plain Java (java.time.Instant, getEpochSecond, and so on). I'm working in Scala with Spark, and I need to convert the datetime column into epoch time.

Sample of my datetime column:

2013-12-31T05:14:22
2013-12-31T16:49:31
2013-12-30T18:29:20
2013-12-30T21:02:29

The format of the datetime I used:

"yyyy-MMM-dd-HH:mm:ss"

Code that I tried, which fails with "can't resolve symbol getEpochSecond":

 val EpochTime: Long=dataDB.select("Date").getEpochSecond

CodePudding user response:

Assuming that your timestamps are in GMT, the code below should work.

import org.apache.spark.sql.functions._
import spark.implicits._ // needed for toDF; assumes an in-scope SparkSession named spark

val inputDF = Seq("2013-12-31T05:14:22", "2013-12-31T16:49:31", "2013-12-30T18:29:20", "2013-12-30T21:02:29")
    .toDF("timestamp")

// Parse the string column with the ISO-style pattern and convert it to epoch seconds.
// Adjust the pattern here if your timestamps carry an offset designator.
val newDF = inputDF.withColumn("epoch", unix_timestamp(col("timestamp"), "yyyy-MM-dd'T'HH:mm:ss"))

newDF.show(false)

If your timestamps are in a particular timezone, you can use a Zulu/offset designator in the pattern or specify the timezone explicitly; for details see Java SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'") gives timezone as IST.
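For a single value outside of Spark, the java.time API the question mentions can do the same conversion; a minimal sketch, assuming the input should be interpreted as UTC:

```scala
import java.time.{LocalDateTime, ZoneOffset}

// Parse an ISO-8601 local datetime and convert it to epoch seconds,
// interpreting the value as UTC (swap in another ZoneOffset if needed).
val epoch: Long = LocalDateTime
  .parse("2013-12-31T05:14:22")
  .toEpochSecond(ZoneOffset.UTC)

println(epoch) // 1388466862
```

This matches what unix_timestamp produces in Spark when the session timezone is UTC.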
