How to Write a .hyper file (in DataBricks) to Blob Storage (in Azure)?


I have created a .hyper file using https://github.com/goodwillpunning/hyperleaup. I want to save the resulting HyperFile (hf) to an existing Azure Blob Storage container instead of publishing it to a Tableau Server. How do I achieve that using Python (PySpark) in Databricks?

CodePudding user response:

You need to create an instance of HyperFile with the is_dbfs_enabled parameter set to True, and then call the save function with the path parameter pointing to the destination on DBFS, but without the dbfs:/ prefix - for example, /FileStore/.
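A minimal sketch of what that could look like in a Databricks notebook, assuming the hyperleaup API behaves as described above (a HyperFile constructor that accepts an is_dbfs_enabled flag and a save() method); the SQL query and destination path are placeholders:

from hyperleaup import HyperFile

# Build the Hyper extract from a Spark SQL query (placeholder query).
# is_dbfs_enabled=True so the file is written via DBFS rather than the
# local driver filesystem.
hf = HyperFile(
    spark,  # existing SparkSession provided by the Databricks notebook
    sql="SELECT * FROM my_database.my_table",
    is_dbfs_enabled=True,
)

# Destination on DBFS without the dbfs:/ prefix, e.g. /FileStore/.
# Assumption: if the Azure Blob Storage container is mounted under
# /mnt/<mount-name>, pointing the path there would land the .hyper
# file in Blob Storage instead of plain DBFS.
hf.save("/FileStore/hyper_files")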
