I want to dynamically get all storage accounts, and the containers within them, from an Azure subscription using Databricks, so that I can go through each file within a container and get the files and their sizes (which I have done earlier).
Now I want to dynamically set which storage account and container to process from my Databricks environment.
CodePudding user response:
In my experience, authentication for all storage operations from Databricks happens at the storage account level. Whether you access the storage account through a service principal or a storage account access key, both are scoped to a single storage account, so you can list the containers within that storage account, but there is no option this way to list the storage accounts within a subscription. As a workaround, you can use PowerShell (e.g. the `Get-AzStorageAccount` cmdlet) to get the storage accounts within the subscription and pass those values into your logic.
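If you would rather stay in Python inside Databricks, the subscription-level listing can also be done with the `azure-mgmt-storage` management SDK instead of PowerShell. Below is a minimal sketch, assuming that package is installed and that you create the management client and per-account blob service elsewhere with your own credentials; the helper name and its injected-client design are illustrative, not part of any Azure API:

```python
# Sketch: collect {storage account name -> [container names]} for a subscription.
# Assumes clients are created elsewhere with your credentials, e.g.:
#   from azure.mgmt.storage import StorageManagementClient
#   storage_client = StorageManagementClient(credentials, subscription_id)
# and a factory that returns a blob service bound to a given account.

def list_account_containers(storage_client, blob_service_factory):
    """Walk every storage account the management client can list and
    return a dict mapping account name -> list of container names."""
    result = {}
    for account in storage_client.storage_accounts.list():
        blob_service = blob_service_factory(account.name)
        result[account.name] = [c.name for c in blob_service.list_containers()]
    return result
```

Because the clients are passed in, the traversal logic itself is plain Python and easy to test or reuse with different authentication setups.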
You can use the code below to get the list of containers within a storage account (this uses the legacy `azure-storage-blob` v2 `BlockBlobService` API):
from azure.storage.blob import BlockBlobService  # legacy azure-storage-blob v2 API

# Replace with your storage account name and access key
blob_service = BlockBlobService(account_name='storageaccount', account_key='accesskey')

containers = blob_service.list_containers()
for c in containers:
    print(c.name)
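From there, getting each file and its size per container follows the same pattern. A short sketch using the same legacy `BlockBlobService` API (where `blob.properties.content_length` holds the size in bytes); the function name and the injected service object are illustrative:

```python
# Sketch: total size in bytes of the blobs in each container of an account,
# using the legacy azure-storage-blob v2 API. The blob service is passed in,
# e.g. BlockBlobService(account_name=..., account_key=...).

def container_sizes(blob_service):
    """Return a dict mapping container name -> total blob bytes for
    every container in the account the service is bound to."""
    sizes = {}
    for container in blob_service.list_containers():
        blobs = blob_service.list_blobs(container.name)
        sizes[container.name] = sum(b.properties.content_length for b in blobs)
    return sizes
```

Combined with the per-subscription account list from the workaround above, this lets you drive the whole traversal dynamically from Databricks.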