I want to select the portion of my dataframe with dates from 2022 up to the latest date, which may include today, tomorrow, and beyond. How can I achieve that?
df= df.filter(col("sales_date").contains("2022"))
CodePudding user response:
You can use the between function or even '>':
df= df.filter(col("date").between("2022-01-01", "2022-12-31"))
or
df= df.filter(col("date") > "2022-01-01")
CodePudding user response:
As mentioned above, the 'between' syntax will do the trick, just make sure your column is converted to a proper date format: https://sparkbyexamples.com/spark/spark-convert-string-to-timestamp-format/
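For example, a small PySpark sketch, assuming sales_date arrives as a string in yyyy-MM-dd format:

from pyspark.sql.functions import col, to_date

# cast the string column to a real date type before comparing
df = df.withColumn("sales_date", to_date(col("sales_date"), "yyyy-MM-dd"))
df = df.filter(col("sales_date").between("2022-01-01", "2022-12-31"))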
CodePudding user response:
You can use like in filter, where % works as a wildcard character.
scala> var df = Seq(("2022-01-01"),("2021-02-01")).toDF
df: org.apache.spark.sql.DataFrame = [value: string]
scala> df = df.withColumn("date",col("value").cast("date"))
df: org.apache.spark.sql.DataFrame = [value: string, date: date]
scala> df.printSchema
root
|-- value: string (nullable = true)
|-- date: date (nullable = true)
scala> df.show()
+----------+----------+
|     value|      date|
+----------+----------+
|2022-01-01|2022-01-01|
|2021-02-01|2021-02-01|
+----------+----------+
scala> df.filter(col("date").like("2022%")).show()
+----------+----------+
|     value|      date|
+----------+----------+
|2022-01-01|2022-01-01|
+----------+----------+
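Since the question uses PySpark, the equivalent filter in Python would look like this (a sketch, assuming the same sales_date column from the question):

from pyspark.sql.functions import col

# '%' is the SQL wildcard, so this keeps every row whose date starts with "2022"
df.filter(col("sales_date").like("2022%")).show()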