Uploading large file with @azure/storage-blob using blockBlobClient.uploadBrowserData with sas key

I'm trying to upload a ~700MB file to the blob storage directly from the front end using the @azure/storage-blob npm package. In order to do this, I'm making a call to our backend to generate an SAS key to authenticate the upload, then using blockBlobClient.uploadBrowserData to do the upload. This is working for smaller files but is failing for the very large file.

I think the issue is that blockBlobClient.uploadBrowserData breaks the upload into multiple blocks once the file size exceeds the single-shot limit (256 MB by default), and only the upload of the first block is authorized by the SAS key provided by the backend, since I eventually get a 403 (This request is not authorized to perform this operation using this permission.) error in the console.

Is there something I can do that will allow me to upload large files using a SAS key generated in the backend?

Here is the code I'm using to retrieve the SAS key from the backend and upload the file to blob storage:

```javascript
// Fetch a SAS key for each file from the backend, then upload each file
// directly to blob storage with that key.
axios
  .all(
    workflowEndpoints.map((workflowEndpoint) => axios.get(workflowEndpoint))
  )
  .then((responses) => {
    responses.forEach((response, i) => {
      const { data } = response;
      const file = files[i];
      const fileName = fileNames[i];
      // The SAS key returned by the backend is appended to the account URL
      // as the query string.
      const blobServiceClient = new BlobServiceClient(
        `https://<storage_account_name>.blob.core.windows.net?${data.key}`
      );
      const containerClient = blobServiceClient.getContainerClient(
        data.container
      );
      const blockBlobClient = containerClient.getBlockBlobClient(fileName);
      blockBlobClient.uploadBrowserData(file).catch(() => {
        setStatus('Error: Upload failed.');
        setLockUser(false);
        throw new Error('Error: Upload failed.');
      });
    });
    if (gatherUploadFileData) {
      gatherUploadFileData(fileNames, multiple, inputId);
    }
    setStatus('Success: Upload complete!');
    setLockUser(false);
    setFiles([]);
  })
  .catch(() => {
    setStatus('Error: Upload failed.');
    setLockUser(false);
    throw new Error('Error: Upload failed.');
  });
```

CodePudding user response:

This is not expected. I just tried a small code snippet and it works as expected (all operations use the same SAS key):

```javascript
const blockBlobClient = containerClient.getBlockBlobClient("block-blob-name");
// Generate 20 MB of random data so the upload is split into multiple blocks.
const content = new Uint8Array(20 * 1024 * 1024);
for (let i = 0; i < content.length; i++) {
    content[i] = Math.floor(Math.random() * 256);
}
const file = new Blob([content]);
file.name = "block-blob-name";
// Keep the single-shot limit small to force the multi-block code path.
await blockBlobClient.uploadBrowserData(file, {
    maxSingleShotSize: 2 * 1024 * 1024,
});
```
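For the actual ~700 MB file, the same options object can be used to tune the block transfer. The snippet below is only a sketch, reusing `containerClient`, `file`, and `fileName` from the question's code; the block size, concurrency, and progress callback values are illustrative, not required settings.

```javascript
// Sketch: tune the multi-block upload for a large file. containerClient,
// file, and fileName are assumed to be the variables from the question.
const blockBlobClient = containerClient.getBlockBlobClient(fileName);
await blockBlobClient.uploadBrowserData(file, {
  blockSize: 8 * 1024 * 1024,          // stage the file in 8 MB blocks
  concurrency: 4,                      // upload up to 4 blocks in parallel
  maxSingleShotSize: 4 * 1024 * 1024,  // larger files take the block path
  onProgress: (ev) => console.log(`Uploaded ${ev.loadedBytes} bytes`),
});
```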

Do you have a proxy between your machine and the Azure Blob endpoint? Proxy or cache servers can sometimes lead to strange behavior.

It may also be worth logging an issue at https://github.com/Azure/azure-sdk-for-js.

CodePudding user response:

Turns out the permissions blockBlobClient.uploadBrowserData needs to upload a file in blocks are different from those needed for the smaller files it uploads in one go. When generating the SAS key, I originally granted only the create permission, which worked for small files but not for larger ones. After adding the add and write permissions to the SAS token, it worked. The SAS token also seems to need to stay valid for the whole duration of the block upload, so I also had to extend the expiration time.
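For reference, here is a minimal sketch of what the backend SAS generation could look like with those permissions and a longer expiry. It assumes the backend also uses @azure/storage-blob with a shared key credential; the account name, key source, and helper name are placeholders, not the original backend code.

```javascript
const {
  BlobSASPermissions,
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
} = require('@azure/storage-blob');

// Placeholder credentials; in practice these come from configuration/secrets.
const credential = new StorageSharedKeyCredential(
  process.env.AZURE_STORAGE_ACCOUNT,
  process.env.AZURE_STORAGE_ACCOUNT_KEY
);

function generateUploadSas(containerName, blobName) {
  return generateBlobSASQueryParameters(
    {
      containerName,
      blobName,
      // "create" alone covered single-shot uploads, but staging and
      // committing blocks also needed "add" and "write".
      permissions: BlobSASPermissions.parse('acw'),
      startsOn: new Date(),
      // Keep the token valid long enough to cover the whole block upload.
      expiresOn: new Date(Date.now() + 60 * 60 * 1000), // 1 hour
    },
    credential
  ).toString();
}
```

The string returned by this helper is what the frontend appends as the query string when constructing the BlobServiceClient URL.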
