How to download excel file with large amount of data?


There is an API that downloads Excel files. Whenever we click the 'Download excel report' button, it calls this API, which fetches the data from the database, writes it into an Excel file, and sends it back as a download.

excelName = key + ".xlsx";

response.setHeader("Content-Disposition", "attachment; filename=" + excelName);

ByteArrayInputStream in = getDownloadBusinessAnalysisKey(key, customerScopeId, response, getBusinessKeyResult, sheetName);

response.setHeader("Content-Length", String.valueOf(in.available()));
try {
  FileCopyUtils.copy(in, response.getOutputStream());
} catch (IOException e) {
  logger.error("Error in report " + e.getLocalizedMessage());
}

It downloads fine when there are about 10,000 records (or close to it).

However, when there are about 100,000 records it just doesn't download; I get no response from the endpoint. Downloading the same report via Postman against my local database does work.

Is there some limit on download size? Would this be a browser, Tomcat server, or HTTP header issue?

CodePudding user response:

When you use a ByteArrayInputStream you load the entire report into memory. What you need instead is to make getDownloadBusinessAnalysisKey write to something you can stream from, usually a file, and then use a FileInputStream to read that file and write it to the response output stream.
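
A minimal sketch of that file-backed approach could look like the following. Here writeBusinessAnalysisReportToFile is a hypothetical variant of your method that writes the workbook to disk instead of returning a ByteArrayInputStream, and FileCopyUtils is the same Spring helper you already use:

private void streamReport(String key, HttpServletResponse response) throws IOException {
  // write the workbook to a temporary file instead of an in-memory byte array
  File tempFile = File.createTempFile("report-", ".xlsx");
  try {
    writeBusinessAnalysisReportToFile(key, tempFile); // hypothetical helper
    response.setHeader("Content-Disposition", "attachment; filename=" + key + ".xlsx");
    response.setHeader("Content-Length", String.valueOf(tempFile.length()));
    try (InputStream in = new FileInputStream(tempFile)) {
      // copies in chunks and closes both streams when done
      FileCopyUtils.copy(in, response.getOutputStream());
    }
  } finally {
    tempFile.delete(); // don't leave temp files behind
  }
}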

Based on your statement that you're using an API to download the Excel file, I am assuming the workbook is not generated inside your current JVM (otherwise I'd recommend looking at SXSSF, which would require some code changes and rethinking to get working).
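
If the workbook actually is built in the same JVM, a rough SXSSF sketch would look something like this (the records parameter and the sheet layout are placeholders, not your real data model; SXSSFWorkbook comes from org.apache.poi.xssf.streaming). SXSSF keeps only a sliding window of rows in memory and flushes older rows to temporary files, so it can stream large workbooks straight to the client:

void writeWithSxssf(HttpServletResponse response, String sheetName, List<String[]> records)
    throws IOException {
  response.setHeader("Content-Disposition", "attachment; filename=report.xlsx");
  // keep at most 100 rows in memory; older rows are flushed to a temp file
  try (SXSSFWorkbook workbook = new SXSSFWorkbook(100)) {
    Sheet sheet = workbook.createSheet(sheetName);
    int rowIndex = 0;
    for (String[] record : records) {
      Row row = sheet.createRow(rowIndex++);
      for (int col = 0; col < record.length; col++) {
        row.createCell(col).setCellValue(record[col]);
      }
    }
    workbook.write(response.getOutputStream()); // stream straight to the client
    workbook.dispose();                         // delete the SXSSF temp files
  }
}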

From your code sample it also looks like you're using the Servlet API, given the setHeader calls. So here's a bit of code that makes a GET connection and more or less proxies it through. There's no need for a temporary file if the data just passes straight through, and there's also no need to buffer unless you can confirm that the servlet engine does not already hand you a buffered stream.

protected void doGet(
    HttpServletRequest req,
    HttpServletResponse response)
  throws ServletException, IOException {
  var url = new URL("http://myapi");
  HttpURLConnection con = (HttpURLConnection) url.openConnection();
  con.setRequestMethod("GET");
  int responseCode = con.getResponseCode();
  int len = con.getHeaderFieldInt("Content-Length", -1);
  String contentType = con.getHeaderField("Content-Type");
  if (contentType == null) {
    contentType = "application/octet-stream";
  }

  assert responseCode == 200;

  if (len >= 0) {
    response.setIntHeader("Content-Length", len);
  }
  response.setHeader("Content-Type", contentType);
  try (
    InputStream is = con.getInputStream();
    OutputStream os = response.getOutputStream()
  ) {
    // copy the upstream response straight through to the client
    int c = is.read();
    while (c != -1) {
      os.write(c);
      c = is.read();
    }
  }
}

This can be optimized with the asynchronous Servlet API to reduce blocking I/O, which matters if you are dealing with a lot of concurrent connections; a rough sketch of that variant follows.
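
This sketch assumes a Servlet 3.0+ container and the same placeholder http://myapi upstream as above (javax.servlet and java.io/java.net imports omitted). It releases the container thread and copies the upstream response on a worker; truly non-blocking writes would additionally need a WriteListener on the response output stream:

@WebServlet(urlPatterns = "/report", asyncSupported = true)
public class ReportProxyServlet extends HttpServlet {
  @Override
  protected void doGet(HttpServletRequest req, HttpServletResponse resp) {
    AsyncContext ctx = req.startAsync();
    ctx.setTimeout(120_000); // give the upstream report generation time to finish
    ctx.start(() -> {        // runs on a container-managed worker thread
      try {
        HttpURLConnection con =
            (HttpURLConnection) new URL("http://myapi").openConnection();
        con.setRequestMethod("GET");
        try (InputStream is = con.getInputStream();
             OutputStream os = ctx.getResponse().getOutputStream()) {
          is.transferTo(os); // Java 9+ bulk copy
        }
      } catch (IOException e) {
        // log the failure; the client will see a failed or truncated download
      } finally {
        ctx.complete();      // tell the container the response is finished
      }
    });
  }
}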

CodePudding user response:

I think the process of getting the data from the database and writing it into the Excel file is taking too long, longer than the server's maximum response time. Tuning that data-fetching and Excel-generation step should solve your issue; see the sketch below.
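
One common tuning step is to stream rows out of the database instead of materializing the whole result set before building the workbook. A rough JDBC sketch, where the table, columns, dataSource and writeRowToSheet are all placeholders rather than the asker's real schema:

try (Connection conn = dataSource.getConnection();
     PreparedStatement ps = conn.prepareStatement(
         "SELECT col_a, col_b FROM business_analysis WHERE customer_scope_id = ?")) {
  ps.setFetchSize(1000);            // hint to the driver to stream rows in batches
  ps.setLong(1, customerScopeId);
  try (ResultSet rs = ps.executeQuery()) {
    while (rs.next()) {
      // append each row to the (ideally SXSSF) sheet as it arrives
      writeRowToSheet(rs.getString("col_a"), rs.getString("col_b"));
    }
  }
}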
