How to copy all files and folders of a different azure directory using databricks (without Azure dat

Time:10-17

I want to connect from Databricks to a folder in a storage account that lives in a different Azure directory (tenant). I have the access key, but it is not working. Is there another way to do this?

# Authenticate to the storage account with its access key
spark.conf.set("fs.azure.account.key.<storageAccountName>.dfs.core.windows.net", "<accessKey>")

# Read a folder of CSV files over ABFS (abfss://<container>@<account>.dfs.core.windows.net/<path>)
df = spark.read.csv("abfss://<containerName>@<storageAccountName>.dfs.core.windows.net/folder/")
display(df)

Error

AbfsRestOperationException: Operation failed: "Server failed to authenticate the request. Please refer to the information in the www-authenticate header.", 401, HEAD, https://<storageAccount>.dfs.core.windows.net/folder/?upn=false&action=getAccessControl&timeout=
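Since the question asks for another way, a common alternative to account keys for cross-directory access is OAuth with an Azure AD service principal registered in the storage account's tenant. Below is a minimal sketch of the Hadoop ABFS OAuth settings this involves; the helper function name and all placeholder values (`<appId>`, `<clientSecret>`, `<tenantId>`, account name) are illustrative, not from the original post.

```python
def abfs_oauth_configs(account: str, client_id: str,
                       client_secret: str, tenant_id: str) -> dict:
    """Hypothetical helper: build the Spark/Hadoop config entries for
    service-principal (OAuth client-credentials) auth against ADLS Gen2."""
    suffix = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# On a Databricks cluster you would then apply these before reading:
# for key, value in abfs_oauth_configs("<storageAccountName>", "<appId>",
#                                      "<clientSecret>", "<tenantId>").items():
#     spark.conf.set(key, value)
```

The service principal also needs an RBAC role such as "Storage Blob Data Reader" on the container; with only the key-based `fs.azure.account.key.*` setting, requests can still fail with 401 when the key or firewall rules do not match the target account.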

CodePudding user response:

Switching to a different cluster made the same code work. I am not sure of the reason, though.
