How to create a schema in Spark with Scala when the input has more than 100 columns?

Time:10-01

With a case class we have some restrictions. Is it possible to use StructType for 100 columns? Is there any other way to create a schema for around 600 columns?

CodePudding user response:

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Build the 600 fields programmatically instead of writing them by hand
val columns = (1 to 600).map(i => s"Column_$i").map(cname => StructField(cname, StringType))
val schemaWithSixHundredColumns = StructType(columns)

// Create an empty DataFrame with that schema
val df = spark.createDataFrame(new java.util.ArrayList[Row](), schemaWithSixHundredColumns)
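
As a sketch of an alternative (assuming Spark 2.3+, where `StructType.fromDDL` is available), the same wide schema can be built from a DDL string and applied when reading, so Spark does not have to infer 600 column types. The file path and `header` option here are placeholders, not from the original question:

```scala
import org.apache.spark.sql.types.StructType

// Generate a DDL string like "Column_1 STRING, Column_2 STRING, ..." and parse it
val ddl = (1 to 600).map(i => s"Column_$i STRING").mkString(", ")
val schemaFromDdl: StructType = StructType.fromDDL(ddl)

// Apply the schema explicitly at read time (placeholder path)
val wideDf = spark.read.schema(schemaFromDdl).csv("/path/to/input.csv")
```

If the real column names are known (for example from a header line), the same `map` over a `Seq[String]` of names works in place of the generated `Column_$i` names.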