Using progress handler when uploading files to AWS S3 with React


I have only recently started working with the AWS SDK, so please excuse me if my approach is complete nonsense.

I want to upload a simple media file to my S3 bucket. I was following this tutorial and so far I am able to upload files without a problem. For usability, a progress bar would be a nice extra, so I researched how to achieve this. I quickly found that the current AWS SDK v3 no longer supports httpUploadProgress and that we should use @aws-sdk/lib-storage instead. Using this library, I am still able to upload files to S3, but I can't get the progress tracker to work! I assume this has something to do with me not fully understanding how to deal with async code within a React component.

So here is my minimal component example (I am using Chakra UI here):

import React, { useRef, useState } from "react";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";
import { CognitoIdentityClient } from "@aws-sdk/client-cognito-identity";
import { fromCognitoIdentityPool } from "@aws-sdk/credential-provider-cognito-identity";
import { Center, Flex, InputGroup, Progress, Text, VStack } from "@chakra-ui/react";
// PlusIcon is a local/custom icon component

const TestAWS: React.FC = () => {
  const inputRef = useRef<HTMLInputElement | null>(null);
  const [progr, setProgr] = useState<number>();

  const region = "eu-west-1";
  const bucketname = "upload-test";

  const handleClick = () => {
    inputRef.current?.click();
  };

  const handleChange = async (e: React.ChangeEvent<HTMLInputElement>) => {

    console.log('Start file upload');

    const file = e.target.files?.[0];
    if (!file) return;

    const target = {
      Bucket: bucketname,
      Key: `jobs/${file.name}`,
      Body: file,
    };

    const s3 = new S3Client({
      region: region,
      credentials: fromCognitoIdentityPool({
        client: new CognitoIdentityClient({ region: region }),
        identityPoolId: "---MY ID---",
      }),
    });

    const upload = new Upload({
      client: s3,
      params: target,
    });

    upload.on("httpUploadProgress", progress => {
      console.log("Progress", progress);

      if (progress.loaded && progress.total) {
        console.log("loaded/total", progress.loaded, progress.total);
        setProgr(Math.round((progress.loaded / progress.total) * 100)); // I was expecting this line to be sufficient for updating my component
      }
    });
    const result = await upload.done();
    console.log(result);
  };

  console.log('Progress', progr);

  return (
    <InputGroup onClick={handleClick}>
      <input ref={inputRef} type="file" multiple={false} hidden accept="video/*" onChange={handleChange} />
      <Flex layerStyle='uploadField'>
        <Center w='100%'>
          <VStack>
            <PlusIcon />
            <Text>Choose Video File</Text>
          </VStack>
        </Center>
      </Flex>
      {progr !== undefined && <Progress value={progr} />}
    </InputGroup>
  );
};

export default TestAWS;

So basically I see the event getting fired (Start file upload). Then it takes a while, and I see the Promise result and Progress, 100 in my console. This tells me that the state variable gets updated (at least once), but the component does not re-render?

What am I doing wrong here? Any help appreciated!

CodePudding user response:

I came across your answer after having exactly the same problem (with Vue) today!

Indeed you are right: the AWS SDK JS v3 httpUploadProgress event only fires once per part, which is not at all clear, and I wasted time debugging that too. For a 4MB file, for example, it would only ever fire at 100%.

As you say, you can experiment with the part size, but the minimum is 5MB, so on a slow connection an upload can appear stuck while you wait for the first 5MB to produce any data. What I did instead was look at the size of the file being uploaded: if it is under a threshold (say 25MB, or whatever is applicable), it is probably safe to upload it all in one go, since you don't really need multipart uploading. For that case I generate a presigned URL (https://aws.amazon.com/blogs/developer/generate-presigned-url-modular-aws-sdk-javascript/), which can be used to PUT the file with axios (since fetch does not support progress events yet).
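For reference, generating such a presigned PUT URL with the modular v3 SDK might look roughly like the sketch below (the helper name presignPut and the one-hour expiry are my own choices, not part of the SDK):

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// Returns a URL that allows a single PUT of the given object key,
// valid for one hour (an arbitrary choice for this sketch).
async function presignPut(s3: S3Client, bucket: string, key: string): Promise<string> {
  const command = new PutObjectCommand({ Bucket: bucket, Key: key });
  return getSignedUrl(s3, command, { expiresIn: 3600 });
}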

So that way you can use Upload for large files (where you actually need multipart uploading and where 5MB is a small percentage of the file size), and a presigned URL for small files, which gives you much more frequent progress updates.

The same progress event handler can be used by both.
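For illustration, such a handler might look like this sketch (in the Vue snippet below it is wired up as the uploadProgress method); both the axios progress event and lib-storage's httpUploadProgress payload expose loaded and total byte counts:

// Shared handler: works for both the presigned-URL PUT (axios) and the
// multipart Upload, since both report `loaded` and `total` in bytes.
function uploadProgress(progress: { loaded?: number; total?: number }) {
  if (progress.loaded && progress.total) {
    const percent = Math.round((progress.loaded / progress.total) * 100);
    console.log(`Upload progress: ${percent}%`);
  }
}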

// Vue (Nuxt) component method; SIGNED-URL-HERE is the presigned PUT URL from above
this.$axios
  .request({
    method: "PUT",
    url: SIGNED-URL-HERE,
    data: file,
    timeout: 3600 * 1000, // allow up to an hour for large files on slow connections
    onUploadProgress: this.uploadProgress,
  })
  .then((data) => {
    console.log("Success", data);
  })
  .catch((error) => {
    console.log("Error", error.code, error.message);
  });

Not ideal but it helps.

CodePudding user response:

Alright, I have found the solution. The callback on the state variable works fine and does what it should, but the configuration of the Upload object was off. After digging into the source I found out that the event listener only gets triggered after the uploader has uploaded more data. Because the uploader chunks the upload, there are two separate config parameters which allow you to control how your upload is split into chunks. So

const upload = new Upload({
  client: s3,
  params: target,
  queueSize: 4,             // concurrent part uploads (4 is the default)
  partSize: 5 * 1024 * 1024 // 5MB is the minimum part size
});

basically does the job when the file we upload is larger than 5MB! Only then does the event get triggered more than once and update the state variable.

Since this uploader is made for handling large file uploads, this totally makes sense and we could simply adjust queueSize and partSize according to the file we want to upload. Something like

const queueSize = 10;
const file = event.target.files[0];

// one tenth of the file size, expressed in MB
const partSizeMB = file.size / (10 * 1024 * 1024);

const upload = new Upload({
  client: s3,
  params: target,
  // only override the defaults when a tenth of the file exceeds the 5MB minimum
  queueSize: partSizeMB > 5 ? queueSize : undefined,
  partSize: partSizeMB > 5 ? partSizeMB * 1024 * 1024 : undefined
});

Obviously, this can be done in a much more sophisticated way, but I did not want to spend too much time on this since it is not part of the original question.
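For instance, a slightly more robust variant (just a sketch; the target of 10 parts is an arbitrary choice) could aim for a fixed number of parts while never dropping below the 5MB minimum:

// Aim for ~10 parts, but never go below S3's 5MB minimum part size.
const MIN_PART_SIZE = 5 * 1024 * 1024;   // bytes
const TARGET_PARTS = 10;                 // arbitrary choice for this sketch

const partSize = Math.max(MIN_PART_SIZE, Math.ceil(file.size / TARGET_PARTS));

const upload = new Upload({
  client: s3,
  params: target,
  queueSize: 4,   // concurrent part uploads (the default)
  partSize,       // bytes per part
});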

Conclusion

If your file is large enough (>5MB), you will see progress updates, with a frequency that depends on the number of chunks (of 5MB or more) you have chosen to split your file into.
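As a rough rule of thumb (assuming the event fires once per part, as observed above), the number of progress updates equals the number of parts:

// Rough estimate of how many progress events to expect:
const partSize = 5 * 1024 * 1024;                // the default part size
const updates = Math.ceil(file.size / partSize); // e.g. 4 updates for a 20MB file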

Since this only affects the handleChange method from the original example, I'll post it here for completeness:

const handleChange = async (event) => {
  const file = event.target.files[0];

  const target = {
    Bucket: 'some-S3-bucket',
    Key: `jobs/${file.name}`,
    Body: file,
  };

  const s3 = new S3Client({
    region: 'your-region',
    credentials: fromCognitoIdentityPool({
      client: new CognitoIdentityClient({ region: 'your-region' }),
      identityPoolId: "your-id",
    }),
  });

  // this will default to queueSize=4 and partSize=5MB
  const upload = new Upload({
    client: s3,
    params: target
  });

  upload.on("httpUploadProgress", progress => {
    console.log('Current Progress', progress);
    if (progress.loaded && progress.total) {
      setProgr(Math.round((progress.loaded / progress.total) * 100));
    }
  });

  const result = await upload.done();
  console.log(result);
};

Maybe this helps someone who has the same problem.
