I am creating a DataFrame from a Python list as below:
_test = [('val1', {'key1': ['A', 'B'], 'key2': ['C'], 'bool_key1': True, 'bool_key2': True}),
('val2', {'key1': ['B'], 'key2': ['D'], 'bool_key1': False, 'bool_key2': None})]
df_test = spark.createDataFrame(_test, schema = ["col1","col2"])
df_test.show(truncate=False)
but the resulting DataFrame has all boolean fields as null!
+----+---------------------------------------------------------+
|col1|col2                                                     |
+----+---------------------------------------------------------+
|val1|[key1 -> [A, B], bool_key2 ->, key2 -> [C], bool_key1 ->]|
|val2|[key1 -> [B], bool_key2 ->, key2 -> [D], bool_key1 ->]   |
+----+---------------------------------------------------------+
The df_test DataFrame schema:
root
|-- col1: string (nullable = true)
|-- col2: map (nullable = true)
| |-- key: string
| |-- value: array (valueContainsNull = true)
| | |-- element: string (containsNull = true)
Is there any way I can create the DataFrame without changing the Python variable structure?
CodePudding user response:
Define the schema, and define the rows using lists instead of tuples. Try the following code:
_test1 = [["val1",{"key1": ["A", "B"], "key2": ["C"], "bool_key1": True, "bool_key2": True}],
["val1",{"key1": ["A", "B"], "key2": ["C"], "bool_key1": True, "bool_key2": True}],
["val2", {"key1": ["B"], "key2": ["D"], "bool_key1": False, "bool_key2": None}]]
df2=spark.createDataFrame(_test1, 'col1 string, col2 struct<key1:array<string>,key2:array<string>,bool_key1:boolean,bool_key2:boolean>')
df2.show(truncate=False)
+----+-------------------------+
|col1|col2                     |
+----+-------------------------+
|val1|{[A, B], [C], true, true}|
|val1|{[A, B], [C], true, true}|
|val2|{[B], [D], false, null}  |
+----+-------------------------+
root
|-- col1: string (nullable = true)
|-- col2: struct (nullable = true)
| |-- key1: array (nullable = true)
| | |-- element: string (containsNull = true)
| |-- key2: array (nullable = true)
| | |-- element: string (containsNull = true)
| |-- bool_key1: boolean (nullable = true)
| |-- bool_key2: boolean (nullable = true)
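With the struct schema in place, the boolean fields are real booleans and can be read back with dot notation on the struct column. A quick usage sketch against df2 above:
# select the nested boolean fields directly from the struct column
df2.select("col1", "col2.bool_key1", "col2.bool_key2").show()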
CodePudding user response:
Adding to @wwnde's answer, there's another way of defining the struct schema (though I would personally prefer @wwnde's answer, since it needs fewer lines of code) -
Defining the struct schema -
from pyspark.sql.types import *

schema = StructType([
    StructField("col1", StringType()),
    StructField("col2", StructType([
        StructField("key1", ArrayType(StringType())),
        StructField("key2", ArrayType(StringType())),
        StructField("bool_key1", BooleanType()),
        StructField("bool_key2", BooleanType())
    ]))
])
Creating the dataframe -
_test = [
('val1', {'key1': ['A', 'B'], 'key2': ['C'], 'bool_key1': True, 'bool_key2': True}),
('val2', {'key1': ['B'], 'key2': ['D'], 'bool_key1': False, 'bool_key2': None})
]
df=spark.createDataFrame(data=_test, schema=schema)
df.printSchema()
Output
root
|-- col1: string (nullable = true)
|-- col2: struct (nullable = true)
| |-- key1: array (nullable = true)
| | |-- element: string (containsNull = true)
| |-- key2: array (nullable = true)
| | |-- element: string (containsNull = true)
| |-- bool_key1: boolean (nullable = true)
| |-- bool_key2: boolean (nullable = true)
If you want to keep the MapType key-value pairs intact, try the below logic -
_test = [
('val1', {'key1': ['A', 'B'], 'key2': ['C'], 'bool_key1': True, 'bool_key2': True}),
('val2', {'key1': ['B'], 'key2': ['D'], 'bool_key1': False, 'bool_key2': None})
]
schema = StructType([
StructField("col1", StringType()),
StructField("col2", (MapType(StringType(), StringType())))
])
spark.createDataFrame(_test, schema=["col1", "col2"]).show(truncate=False)
df_test = spark.createDataFrame(data = _test, schema = schema)
df_test.show(truncate=False)
+----+-------------------------------------------------------------------+
|col1|col2                                                               |
+----+-------------------------------------------------------------------+
|val1|{key1 -> [A, B], bool_key2 -> true, key2 -> [C], bool_key1 -> true}|
|val2|{key1 -> [B], bool_key2 -> null, key2 -> [D], bool_key1 -> false}  |
+----+-------------------------------------------------------------------+
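Note that with MapType(StringType(), StringType()) every value in col2 is stored as a string, so the booleans come back as the strings "true"/"false" (or null). If you need proper booleans again, a minimal sketch using standard Column methods (getItem, cast; the alias names are just illustrative):
from pyspark.sql import functions as F

# pull the boolean-looking strings out of the map and cast them back to boolean
df_test.select(
    "col1",
    F.col("col2").getItem("bool_key1").cast("boolean").alias("bool_key1"),
    F.col("col2").getItem("bool_key2").cast("boolean").alias("bool_key2")
).show(truncate=False)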