How to filter on a Chinese column name in a Spark SQL query?


I am running the following Spark SQL query and it returns all the data:

scala> spark.sql("select * from t1").show()
+------+----+-------+
|  名稱|年齡|address|
+------+----+-------+
|jeremy|  33| Taipei|
|  Mary|  18| Taipei|
|  John|  28|    XXX|
|  大明|  29|    YYY|
|  小黃|  19|    ZZZ|
+------+----+-------+
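
For reference, here is a minimal sketch of how a view like t1 could be set up; the question doesn't show where t1 comes from, so the data source and column types below are assumptions:

// Hypothetical setup: build a DataFrame with Chinese column names
// and register it as the temp view t1 (not shown in the original question).
import spark.implicits._

val df = Seq(
  ("jeremy", 33, "Taipei"),
  ("Mary",   18, "Taipei"),
  ("John",   28, "XXX"),
  ("大明",   29, "YYY"),
  ("小黃",   19, "ZZZ")
).toDF("名稱", "年齡", "address")

df.createOrReplaceTempView("t1")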

But when I add a filter on the 名稱 column, Spark SQL doesn't recognize it:

scala> spark.sql("select * from t1 where 名稱=='jeremy'").show()
org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input '名' expecting <EOF>(line 1, pos 23)

== SQL ==
select * from t1 where 名稱=='jeremy'
-----------------------^^^

  at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:241)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:117)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:643)
  ... 49 elided


scala> spark.sql("select * from t1 where '名稱'=='jeremy'").show()
+----+----+-------+
|名稱|年齡|address|
+----+----+-------+
+----+----+-------+

Does anybody know how to do this?

Thanks

CodePudding user response:

You need to quote the column name with backticks (`). In your second attempt, the single quotes make '名稱' a string literal rather than a column reference, so the query compares two different string constants and returns no rows.

spark.sql("select * from t1 where `名稱`=='jeremy'").show()
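
If you prefer to skip the SQL parser, an equivalent filter through the DataFrame API also works; this is a sketch assuming the same temp view t1:

// Same filter via the DataFrame API; the column name is passed as a plain
// string to col(), so no backticks are needed here.
import org.apache.spark.sql.functions.col

spark.table("t1").filter(col("名稱") === "jeremy").show()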