String to Date in AWS Glue with Data Frames


I'm trying to convert/cast a column within a data frame from string to date, with no success. Here is part of the code:

from pyspark.sql.functions import from_unixtime, unix_timestamp, col, substring, to_timestamp
from datetime import datetime

## DynamicFrame to DataFrame
df = Transform0.toDF()

## Substring of time column
## Before: "Thu Sep 03 2020 01:43:52 GMT 0000 (Coordinated Universal Time)"
df = df.withColumn('date_str', substring(df['time'],5,20))
## After: "Sep 03 2020 01:43:52"

## I have tried the following statements with no success
## I use show() in order to see in logs the result

df.withColumn('date_str', datetime.strptime('date_str', '%b %d %Y %H:%M:%S')).show()
df.withColumn(col('date_str'), from_unixtime(unix_timestamp(col('date_str'),"%b %d %Y %H:%M:%S"))).show()
df.withColumn('date_str', to_timestamp('date_str', '%b %d %Y %H:%M:%S')).show()

CodePudding user response:

You are supposed to assign the result to another data frame variable; withColumn returns a new data frame rather than modifying the existing one.

e.g.:

from pyspark.sql import types

df = df.withColumn('date_str',
                   from_unixtime(unix_timestamp(col('date_str'), 'MMM dd yyyy HH:mm:ss')).cast(types.TimestampType()))
df.show()
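
For what it's worth, assuming Spark 2.2 or later (which AWS Glue ships), the same conversion can be written more directly with to_timestamp(), skipping the unix_timestamp round trip. A minimal sketch using the date_str column from the question:

from pyspark.sql.functions import to_timestamp, col

# Parse "Sep 03 2020 01:43:52" with a Spark (Java-style) pattern, not a Python strptime pattern
df = df.withColumn('date_ts', to_timestamp(col('date_str'), 'MMM dd yyyy HH:mm:ss'))
df.show()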

CodePudding user response:

Use Spark datetime format patterns (MMM, dd, yyyy, HH, ...) rather than Python strptime patterns (%b, %d, ...) when calling Spark functions such as to_timestamp().

Example:

from pyspark.sql.functions import to_timestamp, col

df.show()
#+--------------------+
#|                  ts|
#+--------------------+
#|Sep 03 2020 01:43:52|
#+--------------------+

df.withColumn("ts1", to_timestamp(col("ts"), "MMM dd yyyy HH:mm:ss")).show()
#+--------------------+-------------------+
#|                  ts|                ts1|
#+--------------------+-------------------+
#|Sep 03 2020 01:43:52|2020-09-03 01:43:52|
#+--------------------+-------------------+
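
If an actual DateType column is needed, as the question title suggests, rather than a timestamp, the same pattern can be used with to_date(). A minimal sketch on the same ts column:

from pyspark.sql.functions import to_date

df.withColumn("d1", to_date(col("ts"), "MMM dd yyyy HH:mm:ss")).show()
# d1 -> 2020-09-03 (DateType, time of day dropped)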