I have a column in string format like below:
["name": "XXX","active": true,"locale": "EN","Channel":["1","2"]]
I would like to explode it like below in Spark SQL (preserving the quotes in the string values).
This is the code I used:
SELECT EXPLODE(from_json(col, 'map<string, string>'))
FROM XXX;
I am not able to preserve the quotes around "XXX" and "EN" after exploding.
This is what I want:
| key     | value |
| ------- | ----- |
| name    | "XXX" |
| active  | true  |
| locale  | "EN"  |
| Channel | [1,2] |
CodePudding user response:
The quotes are part of the JSON representation of the data and not the data itself. If there were embedded quotes in the data it would look like:
"\"SOME DATA\""
If you need to add quotes around string values, you can always concatenate them onto the specific columns. You can use the concat function to accomplish this: https://spark.apache.org/docs/latest/api/sql/index.html#concat
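For example, a rough sketch along those lines, assuming a hypothetical table my_table with the JSON string in a column named col, and assuming only the name and locale keys should be re-quoted:

SELECT
  key,
  CASE
    WHEN key IN ('name', 'locale') THEN concat('"', value, '"')  -- re-add quotes for string-valued keys
    ELSE value
  END AS value
FROM (
  SELECT explode(from_json(col, 'map<string,string>'))  -- yields columns key, value
  FROM my_table
) t;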
Alternatively, you can use get_json_object, which allows you to extract specific parts of a JSON object. https://spark.apache.org/docs/3.1.2/api/python/reference/api/pyspark.sql.functions.get_json_object.html
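A minimal sketch of that approach (again assuming a table my_table with the JSON string in col); note that get_json_object returns scalar string values without their quotes, while arrays and objects come back as raw JSON text:

SELECT
  get_json_object(col, '$.name')    AS name,      -- XXX
  get_json_object(col, '$.active')  AS active,    -- true
  get_json_object(col, '$.Channel') AS Channel    -- ["1","2"] as raw JSON text
FROM my_table;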