PDF Not Working After Uploading To Node Server

Time:07-06

I am trying to upload a PDF from the frontend to my Node server. The PDF uploads successfully, but when I try to open it, I can't. Instead, I see a message that says "File can't be opened. Something went wrong." Why is this happening?

Also, please don't suggest third-party PDF upload libraries like multer, etc. I am aware of these libraries, but I want pure Node. Thank you so much.

Frontend code:

const uploadFile = document.getElementById("uploadFile");

uploadFile.addEventListener("change", (event) => {
  readFile(event.target.files[0]);
});

function readFile(file) {
  const uploadDesignPDF = `http://localhost:7000/api/upload/design`;
  let fileReader = new FileReader();
  fileReader.readAsDataURL(file);
  fileReader.addEventListener("load", async (event) => {
    let pdfStrChunk = event.target.result.replace(
      /^data:application\/[a-z]+;base64,/,
      ""
    );
    let fileSize = file.size;
    const chunk = 85000;
    let numOfChunkSet = Math.ceil(fileSize / chunk);
    let remainingChunk = fileSize;
    let currentChunk = 0;
    let chunkSet = [];
    let range = {};
    let data = {};

    for (let i = 0; i < numOfChunkSet; i++) {
      remainingChunk -= chunk;

      if (remainingChunk < 0) {
        remainingChunk += chunk;
        chunkSet.push(remainingChunk);
        range.start = currentChunk;
        range.end = currentChunk + chunk;
        currentChunk += remainingChunk;
      } else {
        chunkSet.push(chunk);
        range.start = currentChunk;
        range.end = (i + 1) * chunkSet[i];
        currentChunk += chunk;
      }

      const chunkRead = pdfStrChunk.slice(range.start, range.end);
      data.dataPDF = chunkRead;

      let response = await fetch(uploadDesignPDF, {
        method: "POST",
        body: JSON.stringify(data),
        headers: {
          "Content-Type": "application/json",
        },
        responseType: "arrayBuffer",
        responseEncoding: "binary",
      });
      let results = await response.json();
      console.log(results);
    }
  });
}

Backend route:

const { uploadDesigns } = require("./upload.designs.controller.js");
const router = require("express").Router();

router.post("/upload/design", uploadDesigns);

Backend:

const fs = require("fs");

module.exports = {
  uploadDesigns: async (req, res) => {
    try {
      fs.writeFileSync(`./designs/testingPDF6.pdf`, req.body.dataPDF, "base64");
      res.status(200).json({
        message: "done with chunk",
      });
    } catch (error) {
      res.status(500).json({
        message: "Something went wrong. Please refresh page.",
      });
    }
  },
};

CodePudding user response:

You are working with a base64 data URL in vain. It is much more effective to use an ArrayBuffer. The main advantage of an ArrayBuffer is that it works in raw 1-byte units, while base64 encodes every 3 bytes of data as 4 characters, so a chunk boundary falls cleanly on a byte boundary only once every four characters. That is also why slicing the base64 string by byte-sized ranges, as the code above does, corrupts the reassembled file.
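The size overhead is easy to demonstrate in Node; the 85 000-byte figure below simply mirrors the chunk size used in the question:

```javascript
// base64 encodes every 3 bytes of input as 4 ASCII characters,
// so the encoded form is roughly a third larger than the raw data.
const raw = Buffer.alloc(85000); // one chunk at the question's chunk size
const encoded = raw.toString("base64");

console.log(raw.length);     // 85000
console.log(encoded.length); // 113336 (= ceil(85000 / 3) * 4)
```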

Instead of sending the file in chunks, I would suggest tracking upload progress through `XMLHttpRequest.upload.onprogress`. I would only use chunks if the upload goes through a WebSocket.

If the PDF file is the only information sent to the server, I'd prefer to send the file directly, without field names or any other FormData wrapping. In that case, it would also be appropriate to change the POST method to PUT.

If you send the file directly, it would be ideal to use fs.createWriteStream() instead of fs.writeFileSync(). Then this approach will work:

  const ws = fs.createWriteStream(tmpFilePath);
  request.pipe(ws);

To verify the integrity of the data, you can add an MD5 or SHA hash to the request headers and, on the server, duplicate the data stream into a hash object created by crypto.createHash(). If the hashes don't match, the file can be uploaded again.
