Showing all blobs in a (foreign) container is possible with the code below, so I know the provided SAS URL is valid:
from azure.storage.blob import ContainerClient, BlobServiceClient

sas_url = r'[the sas_token]'
container = ContainerClient.from_container_url(sas_url)
blob_list = container.list_blobs()
for blob in blob_list:
    print(blob.name)
How do I download the contents of the container to a local folder? With our own containers I would connect with a BlobServiceClient using the provided connection-string, which I don't have for this container.
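For reference, the connection-string flow I would normally use with our own containers looks roughly like this (all names are placeholders):

from azure.storage.blob import BlobServiceClient

# placeholder connection string and container name for our own storage account
service = BlobServiceClient.from_connection_string('[connection string]')
own_container = service.get_container_client('[container name]')
for blob in own_container.list_blobs():
    print(blob.name)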
CodePudding user response:
You are almost there. All you need to do is create a BlobClient from the ContainerClient and the blob name using the get_blob_client method. Once you have that, you will be able to download the blob using the download_blob method.
Your code would be something like:
sas_url = r'[the sas_token]'
container = ContainerClient.from_container_url(sas_url)
blob_list = container.list_blobs()
for blob in blob_list:
    print(blob.name)
    blob_client = container.get_blob_client(blob.name)
    # download_blob() returns a StorageStreamDownloader; readall() gives the blob's bytes
    data = blob_client.download_blob().readall()
Please ensure that your SAS URL has Read permission, otherwise the download operation will fail.
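Since the goal is to get the contents into a local folder, a minimal sketch that writes each blob to disk could look like this (the target folder name is just an example):

import os
from azure.storage.blob import ContainerClient

sas_url = r'[the sas_token]'
target_dir = 'downloaded_blobs'  # example local folder, adjust as needed

container = ContainerClient.from_container_url(sas_url)
for blob in container.list_blobs():
    local_path = os.path.join(target_dir, blob.name)
    # recreate the blob's virtual folder structure locally before writing
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    with open(local_path, 'wb') as f:
        f.write(container.download_blob(blob.name).readall())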
CodePudding user response:
If someone else tries to save CSVs from a blob container, here is the code I used with Gaurav's help:
from io import BytesIO

import pandas as pd
from azure.storage.blob import ContainerClient, BlobServiceClient

sas_url = r'[SAS_URL]'
sas_token = r'[SAS_token]'
container = ContainerClient.from_container_url(sas_url)
# account_url must be the blob endpoint of the storage account
blob_service_client = BlobServiceClient(account_url='https://[ACCOUNT NAME].blob.core.windows.net', credential=sas_token)
blob_list = container.list_blobs()
for blob in blob_list:
    name = blob.name
    # keep only the part after the last '/' as the local file name
    filename = name[name.rfind('/') + 1:]
    if name.endswith('.csv'):
        try:
            blob_client = blob_service_client.get_blob_client(container='[CONTAINER]', blob=name)
            data = blob_client.download_blob().readall()
            df = pd.read_csv(BytesIO(data))
            df.to_csv(filename)
        except Exception as e:
            print(f'Failed to download {name}: {e}')
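A shorter variant is also possible that skips the intermediate BlobServiceClient and downloads straight through the ContainerClient built from the SAS URL (assuming the SAS grants Read and List permissions):

from io import BytesIO

import pandas as pd
from azure.storage.blob import ContainerClient

sas_url = r'[SAS_URL]'
container = ContainerClient.from_container_url(sas_url)
for blob in container.list_blobs():
    if blob.name.endswith('.csv'):
        filename = blob.name.split('/')[-1]
        # download the blob bytes and round-trip them through pandas, as above
        df = pd.read_csv(BytesIO(container.download_blob(blob.name).readall()))
        df.to_csv(filename)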