In Spark SQL I could not find a function that adds days to a timestamp and returns a timestamp, where the number of days to add is a calculated value.
This works, but it is static:
SELECT col1 + interval 2 days FROM
VALUES make_timestamp(2021, 12, 28, 6, 30, 45.887)
I need one where the number of days can be calculated. This doesn't work (I cannot use a column value either):
SELECT col1 + interval (5-3) days
FROM VALUES make_timestamp(2021, 12, 28, 6, 30, 45.887)
This is not good either, as it returns a date instead of a timestamp:
SELECT date_add(col1,1) FROM
VALUES make_timestamp(2021, 12, 28, 6, 30, 45.887)
CodePudding user response:
One way is to convert the timestamp to Unix time and add the desired number of seconds:
scala> spark.sql("select from_unixtime(unix_timestamp('2021-10-15 10:11:12') 10 * 24*60*60) plus10days").show(false)
+-------------------+
|plus10days         |
+-------------------+
|2021-10-25 10:11:12|
+-------------------+
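The same approach works with a computed or column-based day count. A minimal sketch using the question's timestamp (the alias t and column n_days are made up for illustration); note that unix_timestamp works in whole seconds, so the .887 fractional part is dropped, and from_unixtime returns a string rather than a timestamp:

scala> spark.sql("select from_unixtime(unix_timestamp(col1) + n_days * 24*60*60) shifted from values (make_timestamp(2021, 12, 28, 6, 30, 45.887), 2) as t(col1, n_days)").show(false)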
CodePudding user response:
Instead of using the interval constructor directly, you can build a string representing your interval with the concat SQL function and then cast it to an interval with the cast SQL function:
SELECT col1 + cast(concat(5-3, ' days') as interval)
FROM VALUES make_timestamp(2021, 12, 28, 6, 30, 45.887)
which returns the following result:
+----------------------------------------------------------------------------------+
|CAST(col1 + CAST(concat(CAST((5 - 3) AS STRING),  days) AS INTERVAL) AS TIMESTAMP)|
+----------------------------------------------------------------------------------+
|2021-12-30 06:30:45.887                                                           |
+----------------------------------------------------------------------------------+
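This also works when the day count comes from a column rather than a literal expression, since concat implicitly casts its arguments to strings. A minimal sketch, assuming a hypothetical column n holding the number of days to add:

SELECT col1 + cast(concat(n, ' days') as interval)
FROM VALUES (make_timestamp(2021, 12, 28, 6, 30, 45.887), 2) AS t(col1, n)

Unlike the Unix-time approach, this keeps the fractional seconds and returns a proper timestamp.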