Error executing a SQL query in Spark SQL


I'm trying to execute this query in PySpark and I keep getting an error. I have looked everywhere but I can't figure out why it doesn't work; if someone can help me, I'd appreciate it. The goal of this query is to populate a new column that I will create, called temp_ok. This is my code:

CASE WHEN _temp_ok_calculer='non' AND Operator level 2 ="XXX" OR  Operator level 2= "AAA" AND Auto Cleaning criteria !="YYY" Auto Cleaning criteria <> "AA"  AND Workstation Type = "Chaine A" THEN 'ok' ELSE CASE WHEN _temp_ok_calculer='ok' THEN 'ok' ELSE 'ko' END END

My table contains these columns: _temp_ok_calculer, Operator level 2, Auto Cleaning criteria, Workstation Type

CodePudding user response:

Spark SQL uses backticks (`) as identifier delimiters, and identifiers that contain spaces must be written as delimited identifiers in SQL. So

CASE WHEN _temp_ok_calculer='non' AND `Operator level 2` ="XXX" OR . . .
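Beyond the backticks, the original expression also mixes AND/OR without parentheses and seems to be missing an AND between the two Auto Cleaning criteria conditions, so the grouping below is only a guess at the intended logic. Here is a minimal PySpark sketch of how the corrected expression could be applied; the DataFrame name df, the sample rows, and the flattened CASE are assumptions, not part of the original answer:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical sample data with the columns listed in the question.
df = spark.createDataFrame(
    [("non", "XXX", "ZZZ", "Chaine A"), ("ok", "BBB", "YYY", "Chaine B")],
    ["_temp_ok_calculer", "Operator level 2", "Auto Cleaning criteria", "Workstation Type"],
)

# Backtick every identifier that contains spaces; group the OR explicitly.
df = df.withColumn(
    "temp_ok",
    F.expr("""
        CASE
            WHEN _temp_ok_calculer = 'non'
                 AND (`Operator level 2` = 'XXX' OR `Operator level 2` = 'AAA')
                 AND `Auto Cleaning criteria` != 'YYY'
                 AND `Auto Cleaning criteria` <> 'AA'
                 AND `Workstation Type` = 'Chaine A'
            THEN 'ok'
            WHEN _temp_ok_calculer = 'ok' THEN 'ok'
            ELSE 'ko'
        END
    """),
)

df.show(truncate=False)

Note that the nested ELSE CASE ... END END from the question collapses into a second WHEN branch, which is equivalent and easier to read.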