Read files from HDFS using Spark (Scala)


Please tell me how to read files from HDFS. I'm just starting to work with Scala and Spark. I can read a single file that sits in a folder:

val parqDF = spark.read.parquet("hdfs://nn1home:8020/user/stg/ads/year=2020/month=1/day=1/16_data.0.parq")

but I would like to read the whole folder with all the Parquet files in it.

Also, one more important question: how can I add columns to my DataFrame with data taken from the path where my Parquet files are stored?

Thanks for any advice.

CodePudding user response:

import org.apache.spark.sql.functions.lit

// point inputPath at the folder; spark.read.parquet picks up every Parquet file beneath it
val inputPath = "<your path>"
val dataDF = spark.read.parquet(inputPath)
dataDF.printSchema()
//    root
//    |-- _c0: string (nullable = true)
//    |-- _c1: string (nullable = true)
//    |-- _c2: string (nullable = true)
//    |-- _c3: string (nullable = true)

// add a constant column holding the input path
val resDF = dataDF.withColumn("new_col", lit(inputPath))
resDF.printSchema()
//    root
//    |-- _c0: string (nullable = true)
//    |-- _c1: string (nullable = true)
//    |-- _c2: string (nullable = true)
//    |-- _c3: string (nullable = true)
//    |-- new_col: string (nullable = false)

resDF.schema
//    res2: org.apache.spark.sql.types.StructType = StructType(
//      StructField(_c0,StringType,true), 
//      StructField(_c1,StringType,true), 
//      StructField(_c2,StringType,true), 
//      StructField(_c3,StringType,true), 
//      StructField(new_col,StringType,false)
//    )

// resDF.show(false)   // display the DataFrame rows without truncating long values
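
Regarding the second question: since the path in the question follows a Hive-style year=.../month=.../day=... layout, here is a minimal sketch of another approach (the base path is copied from the question and may need adjusting for your cluster). Reading the base directory lets Spark's partition discovery turn the path segments into columns, and input_file_name() exposes each row's source file path.

import org.apache.spark.sql.functions.input_file_name

// Reading the base directory lets Spark's partition discovery turn the
// year=/month=/day= path segments into regular columns of the DataFrame.
val basePath = "hdfs://nn1home:8020/user/stg/ads"   // base path taken from the question
val partitionedDF = spark.read.parquet(basePath)
partitionedDF.printSchema()   // schema now also contains year, month and day columns

// input_file_name() records the full HDFS path of the file each row was read from,
// which can be parsed further if other parts of the path are needed as columns.
val withSourceDF = partitionedDF.withColumn("source_file", input_file_name())
// withSourceDF.show(false)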