Azure BlobClient not writing json correctly to Storage


I have JSON from an API response stored in a variable.

When I print this variable I get:

print(json)
{"items": [{"id": "123", "name": "A"}, {"id": "456", "name": "B"}], "count": 2}
print(type(json))
<class 'dict'>

I am trying to upload this variable as a blob to an Azure Blob Storage container like this:

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

blob_name = 'test.json'
blob_url = f"{account_url}/{container_name}/{blob_name}"
sa_credential = DefaultAzureCredential()

blob_client = BlobClient.from_blob_url(
    blob_url=blob_url,
    credential=sa_credential
)

blob_client.upload_blob(json)

So far so good: a file is created in the container. However, the content of the file is:

items=id&items=name&items=id&items=name&count=2

Where I would expect it to be:

{"items": [{"id": "123", "name": "A"}, {"id": "456", "name": "B"}], "count": 2}

Any ideas?

CodePudding user response:

I think upload_blob only takes string, bytes, or IO object instances as its data argument. The documentation does not specify what type data should be, but in the source code it is typed as Union[Iterable[AnyStr], IO[AnyStr]]:

def _upload_blob_options(  # pylint:disable=too-many-statements
            self, data,    # type: Union[Iterable[AnyStr], IO[AnyStr]]

Here AnyStr is a type variable defined as AnyStr = TypeVar('AnyStr', str, bytes). So one way to solve this is to serialize the dictionary to a JSON string with json.dumps before uploading. Note that in your code the dictionary itself is named json, which shadows the json module, so rename it first (say, to payload):

blob_client.upload_blob(json.dumps(payload))
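
Putting it together, here is a minimal end-to-end sketch (untested against your account; the payload variable name, the placeholder account_url and container_name values, and the overwrite/content-type settings are my additions, not part of the original question):

import json

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient, ContentSettings

# The dict from the API response; renamed to "payload" so it no longer
# shadows the json module.
payload = {"items": [{"id": "123", "name": "A"}, {"id": "456", "name": "B"}], "count": 2}

account_url = "https://<storage-account>.blob.core.windows.net"  # placeholder
container_name = "<container>"                                    # placeholder
blob_name = "test.json"
blob_url = f"{account_url}/{container_name}/{blob_name}"

blob_client = BlobClient.from_blob_url(
    blob_url=blob_url,
    credential=DefaultAzureCredential(),
)

# Serialize the dict to a JSON string before uploading; the content type
# setting is optional but marks the blob as JSON for downstream consumers.
blob_client.upload_blob(
    json.dumps(payload),
    overwrite=True,
    content_settings=ContentSettings(content_type="application/json"),
)

upload_blob also accepts bytes, so json.dumps(payload).encode("utf-8") works as well.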