Cast a Spark DataFrame's existing schema at once


I have a DataFrame whose columns are all of String type, and a schema that contains the desired type for each column. Is there any way to wrap the conversion in one big try/catch block and convert the whole schema dynamically at once? The only solution I've seen is to handle each column individually and convert its type.

CodePudding user response:

Try:

val newDf = sparkSession.createDataFrame(oldDf.rdd, schema)
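Note that createDataFrame(oldDf.rdd, schema) only reattaches the schema metadata; it does not convert the underlying String values, so mismatched types can surface as runtime errors once an action is executed. A minimal alternative sketch that actually casts each column dynamically in one pass, assuming the schema's field names match the DataFrame's column names (castToSchema is a hypothetical helper name, not from the original answer):

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.StructType

// Cast every column to the type declared in `schema`.
// Assumes schema field names match the DataFrame's columns;
// values that cannot be cast become null rather than throwing.
def castToSchema(df: DataFrame, schema: StructType): DataFrame =
  df.select(schema.fields.map(f => col(f.name).cast(f.dataType)): _*)

// Usage (hypothetical): val typedDf = castToSchema(oldDf, schema)

Because Spark's cast yields null on failure instead of throwing, this avoids the need for a try/catch around the conversion; invalid values can be detected afterwards by checking for nulls.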