SELECT as is_boolean in Pyspark?


In Snowflake/SQL you can do something like:

SELECT my_col = 'yes' as is_my_col FROM my_table

This selects the comparison result as a boolean value. How can I do the same in PySpark if I have a DataFrame?

For example, with my_df:

my_df.select(??.cast('bool'))

CodePudding user response:

from pyspark.sql import functions as F

# Compare the column to 'yes' and expose the boolean result under a new name
my_df.select((F.col('my_col') == 'yes').alias('is_my_col'))
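For context, here is a minimal end-to-end sketch. The sample data and the SparkSession setup are only illustrative assumptions standing in for my_table:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical sample data standing in for my_table
my_df = spark.createDataFrame([('yes',), ('no',), ('yes',)], ['my_col'])

# Equivalent of: SELECT my_col = 'yes' AS is_my_col FROM my_table
result = my_df.select((F.col('my_col') == 'yes').alias('is_my_col'))

# result now has a single boolean column is_my_col (true, false, true for the rows above)
result.show()

Note that the comparison already yields a boolean column, so no explicit .cast('boolean') is needed.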