Error while reading a folder in Azure Databricks which has subfolders with Parquet files


I am reading a folder in ADLS from Azure Databricks; the folder has subfolders containing Parquet files.

path - base_folder/filename/

The filename folder has subfolders such as 2020 and 2021, and these in turn have subfolders for month and day.

So the path to an actual Parquet file looks like base_folder/filename/2020/12/01/part11111.parquet.

I am getting an error if I give the base folder path.

I have tried the commands in the thread below as well, but it shows the same error.

Please help me read all Parquet files in all subfolders into one dataframe.

CodePudding user response:

Try with:

spark.read.format("parquet").load(landingFolder)

as specified here: Generic Load/Save Functions
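
If that still fails on the nested year/month/day layout, the sketch below is one way to make Spark pick up the part files in the subfolders. It is a minimal sketch, assuming Spark 3.0+ on Databricks; the container and storage account names in the ABFS URI are placeholders, and landingFolder is assumed to point at the base_folder/filename directory from the question.

# Minimal sketch, assuming Spark 3.0+; container/account names are placeholders
landingFolder = "abfss://<container>@<storage_account>.dfs.core.windows.net/base_folder/filename/"

df = (
    spark.read
    .format("parquet")
    # descend into the year/month/day subfolders instead of only the top level
    .option("recursiveFileLookup", "true")
    .load(landingFolder)
)

df.show(5)

On older Spark versions, a wildcard path such as base_folder/filename/*/*/*/*.parquet gives the same result without the recursiveFileLookup option.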
