How to solve "failed to parse Content-Range Header" in Google Drive API for performing resumable uploads


I am working on code to perform resumable uploads to Google Drive via the Google Drive API. I am stuck with the error "Failed to parse Content-Range header". Here is a sample of the code I have.

const new_access_token = await refreshAccessToken(); // Function to fetch a new access token on every run, to avoid access token expiration errors
const res = await axios.put(resumable_url,{
    body: data
},{
    headers:{
        "Authorization": `Bearer ${new_access_token}`,
        "Content-Length": 256000,
        "Content-Range": `bytes 0-255999/2000000`
    }
})
console.log(res)

I am trying to upload the first 256 KB of the file, which is 2,000,000 bytes in total. I have passed a parameter called data, which is a Buffer of 256,000 bytes (a small chunk of the bigger file). It looks something like

<Buffer 0,0,0,32,102,116,121,112,105,15,111,109,0,0,2,0,105,115,111,109,105,115,111,50,97,118,99,49,109,112,52,49,0 ....

Later I will modify the code to make multiple PUT requests to upload the complete file.

I have successfully fetched a new access_token with the refresh_token, as well as the resumable_session_uri, but I don't know what I am doing wrong. I have also tried using the wildcard in Content-Range, which is used when the size of the file is unknown, but I was unable to resolve the error.

CodePudding user response:

I believe your goal is as follows.

  • You want to achieve a resumable upload using the Drive API with axios in Node.js.
  • You already have a valid access token for uploading a file to Google Drive.

Modification points:

  • Looking at your script, I'm worried that you might have misunderstood the resumable upload flow of the Drive API. The flow of a resumable upload is as follows.

    1. Retrieve the endpoint for uploading the data. Here, the access token is used.
    2. Upload the chunks to that endpoint. Here, the access token is not required.
  • Your chunk size of 0-255999 needs to be modified. When the resumable upload is run with that chunk size, the following error message is shown (see the sketch after this list for valid chunk boundaries).

    The number of bytes uploaded is required to be equal or greater than 262144, except for the final request (it's recommended to be the exact multiple of 262144).

  • Also, in your script, it seems that data is not split into chunks.
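
As an illustration of the chunk-size rule (this short sketch is not part of the original answer), the following shows the Content-Range values that a 262144-byte chunk size produces for your 2,000,000-byte file. Only the final chunk may be smaller than 262144 bytes.

// Minimal sketch: print the Content-Range header for each 262144-byte chunk of a 2,000,000-byte file.
const fileSize = 2000000;
const chunkSize = 262144; // 256 KiB; every chunk except the last must be a multiple of this size.
for (let start = 0; start < fileSize; start += chunkSize) {
  const end = Math.min(start + chunkSize, fileSize) - 1;
  console.log(`Content-Range: bytes ${start}-${end}/${fileSize}`);
}
// Prints: bytes 0-262143/2000000, bytes 262144-524287/2000000, ..., bytes 1835008-1999999/2000000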

I think these are the reasons for your current issue. When these points are reflected in a sample script, it becomes as follows.

Sample script:

In this sample script, in order to demonstrate the resumable upload, the file data is loaded from the local PC and then uploaded to Google Drive with a resumable upload.

const filepath = "./###"; // Please set the filename and file path of the upload file.

const new_access_token = "###"; // Please set your access token.
const name = "###"; // Please set the filename on Google Drive.
const mimeType = "###"; // Please set the mimeType of the file being uploaded. This might not be required.

// 1. Prepare chunks from loaded file data.
const split = 262144; // This is a sample chunk size.
const data = await fs.readFile(filepath);
const fileSize = data.length;
const array = [...new Int8Array(data)];
const chunks = [...Array(Math.ceil(array.length / split))].map((_) => Buffer.from(new Int8Array(array.splice(0, split))));

// 2. Retrieve the endpoint for uploading a file.
const res1 = await axios({
  method: "POST",
  url: "https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable",
  headers: {
    Authorization: `Bearer ${new_access_token}`,
    "Content-Type": "application/json",
  },
  data: JSON.stringify({ name, mimeType }),
});
const { location } = res1.headers;

// 3. Upload the data using chunks.
let start = 0;
for (let i = 0; i < chunks.length; i++) {
  const end = start + chunks[i].length - 1;
  const res2 = await axios({
    method: "PUT",
    url: location,
    headers: { "Content-Range": `bytes ${start}-${end}/${fileSize}` },
    data: chunks[i],
  }).catch(({ response }) => console.log({ status: response.status, message: response.data }));
  start = end + 1;
  if (res2?.data) console.log(res2?.data);
}
  • In this sample script, please use the following modules.

      const axios = require("axios");
      const fs = require("fs").promises;
    
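If an upload is interrupted, the same session URI can also be asked how many bytes it has already received, so the upload can be resumed from there. This is a hedged sketch of that status check (it is not part of the original answer); it reuses the location and fileSize variables from the script above.

// Ask the resumable session for its current status by sending an empty PUT
// with a wildcard Content-Range. An incomplete upload answers with HTTP 308
// and a "Range: bytes=0-N" header; resume the upload from byte N + 1.
const statusRes = await axios({
  method: "PUT",
  url: location,
  headers: { "Content-Range": `bytes */${fileSize}`, "Content-Length": 0 },
  validateStatus: (s) => s === 308 || (s >= 200 && s < 300),
});
if (statusRes.status === 308) {
  console.log({ receivedSoFar: statusRes.headers.range }); // e.g. "bytes=0-262143"
}
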

Testing:

When this sample script is run, the following result is obtained.

When the file upload is not finished, the following message is shown.

{ status: 308, message: '' }

When the file upload is finished, the following message is shown.

{
  kind: 'drive#file',
  id: '###',
  name: '###',
  mimeType: '###'
}
