Chrome ERR_INSUFFICIENT_RESOURCES workaround


I have a JS library that is responsible for downloading JPEG images for the client. All of this is done asynchronously. In some cases the number of images is really large... around 5000 images. In that case, the Chrome browser issues an "ERR_INSUFFICIENT_RESOURCES" error for the AJAX requests.

Each request must be made individually; there is no option to pack the images on the server side.

What are my options here? How can I find a workaround for this problem? The download works fine in Firefox...

Here is the code that performs the actual download:

function loadFileAndDecrypt(fileId, key, type, length, callback, obj) {

    // downloadStep and counter are assumed to be declared at module scope,
    // since they track progress across all calls of this function.
    var step = 100 / length;

    eventBus.$emit('updateProgressText', "downloadingFiles");

    var req = new dh.crypto.HttpRequest();
    req.setAesKey(key);
    let dataUrl;

    if (type == "study") {
        dataUrl = "/v1/images/";
    } else {
        dataUrl = "/v1/dicoms/";
    }

    var url = axios.defaults.baseURL + dataUrl + fileId;
    req.open("GET", url, true);
    req.setRequestHeader("Authorization", authHeader().Authorization + "");
    req.setRequestHeader("Accept", "application/octet-stream, application/json, text/plain, */*");
    req.responseType = "arraybuffer";

    req.onload = function() {
        console.log(downloadStep);
        downloadStep += step;
        eventBus.$emit('updatePb', Math.ceil(downloadStep));

        var data = req.response;
        obj.push(data);
        counter++;
        // last one
        if (counter == length) {
            callback(obj);
        }
    };

    req.send();
}

CodePudding user response:

The error most likely means your code is exhausting the browser's memory and connection resources. Instead of firing everything at once, have your frontend request the 5000 individual images in a controlled flow, limiting how many requests are in flight at a time. Regardless, downloading 5000 images this way is a bad idea; you should pack them up for downloading. If you only mean fetching the images for display, then loading them in the frontend through static or dynamic links is much more logical ;)
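
As a minimal sketch of that flow control, assuming a hypothetical fetchImage(fileId) helper that wraps a single request and resolves with the image data:

async function downloadInBatches(fileIds, batchSize, fetchImage) {
    const results = [];
    for (let i = 0; i < fileIds.length; i += batchSize) {
        const batch = fileIds.slice(i, i + batchSize);
        // Wait for the whole batch to finish before starting the next one,
        // so at most `batchSize` requests are in flight at any time.
        const data = await Promise.all(batch.map(id => fetchImage(id)));
        results.push(...data);
    }
    return results;
}

Keeping the number of in-flight requests bounded is usually enough to avoid exhausting Chrome's per-page resources.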

CodePudding user response:

Create a class:

  • Which accepts the file ID (the image that needs to be downloaded) as an argument
  • Which can perform the HTTP API request
  • Which can store the result of the request

Create an array of objects of this class, one for each file ID that needs to be downloaded.

Store the array in a RequestManager which can start and manage the downloads (see the sketch after this list):

  • Can batch the downloads, e.g. fire 5 requests from the array and wait for them to finish before starting the next batch
  • Can stop the downloads after multiple failures
  • Can adjust the batch size depending on the available bandwidth
  • Stops downloads on auth expiry and resumes them on auth refresh
  • Offers to retry the previously failed downloads
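
A rough sketch of that structure (the class and method names here are illustrative, not an existing API, and fetchArrayBuffer stands in for whatever actually performs the HTTP request and decryption):

class ImageDownload {
    constructor(fileId) {
        this.fileId = fileId;
        this.result = null;   // holds the ArrayBuffer once the request succeeds
        this.failed = false;
    }

    async run(fetchArrayBuffer) {
        try {
            this.result = await fetchArrayBuffer(this.fileId);
        } catch (e) {
            this.failed = true;
        }
    }
}

class RequestManager {
    constructor(downloads, batchSize = 5) {
        this.downloads = downloads;   // array of ImageDownload objects
        this.batchSize = batchSize;
    }

    async start(fetchArrayBuffer) {
        // Fire the requests in fixed-size batches.
        for (let i = 0; i < this.downloads.length; i += this.batchSize) {
            const batch = this.downloads.slice(i, i + this.batchSize);
            await Promise.all(batch.map(d => d.run(fetchArrayBuffer)));
        }
        // One retry pass over anything that failed.
        for (const d of this.downloads.filter(d => d.failed)) {
            d.failed = false;
            await d.run(fetchArrayBuffer);
        }
        return this.downloads.map(d => d.result);
    }
}

Stopping after repeated failures, pausing on auth expiry, and adapting the batch size can all be layered onto start() in the same way.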