Nodejs: how can I wait for pipe and createWriteStream to finish before executing next lines?

Time:02-11

I'm new to Nodejs. I am saving a zip file from an S3 bucket to the EC2 instance that hosts the Node.js service, and then decompressing the zip file locally in the EC2 file system:


  const s3Item = await S3.getFile(objectBucket, objectKey);
  let stream = s3Item.Body.pipe(fs.createWriteStream(sourceFilePath, { mode: 0o777 }));
  stream.on('finish', () => {
    logger.info(` pipe done `);  
  });


  logger.info(`start decompressing...`);
  await decompressArchive(sourceFilePath, targetDirectory_decompressed);

  ... a lot of code...

However, "start decompressing..." is always printed before "pipe done".

How can I make these two steps synchronous such that we wait until pipe done, and then start to decompress?

I want to use async/await, since I simply cannot put all the rest of the code in the stream.on('finish', () => {}); handler: there is too much of it, and it all relies on the stream having finished.

I have searched for related answers (there are many of them), but I still cannot get it working.

CodePudding user response:

Well, streams are event driven. You can put the code you want to run after the stream is done inside the 'finish' event handler:

  const s3Item = await S3.getFile(objectBucket, objectKey);
  let stream = s3Item.Body.pipe(fs.createWriteStream(sourceFilePath, { mode: 0o777 }));
  stream.on('finish', async () => {
    logger.info(` pipe done `);  
    logger.info(`start decompressing...`);
    await decompressArchive(sourceFilePath, targetDirectory_decompressed);
  });

Or, you can wrap the 'finish' event in a promise and await that (rejecting on 'error' so failures aren't swallowed):

  const s3Item = await S3.getFile(objectBucket, objectKey);
  let stream = s3Item.Body.pipe(fs.createWriteStream(sourceFilePath, { mode: 0o777 }));

  await new Promise((resolve, reject) => {
    stream.on('finish', () => {
      logger.info(` pipe done `);
      resolve();  
    }).on('error', err => {
      reject(err);
    });
  });

  logger.info(`start decompressing...`);
  await decompressArchive(sourceFilePath, targetDirectory_decompressed);

FYI, recent versions of nodejs have a once() function in the events module that makes this a bit easier (the promise it returns also rejects if the stream emits an 'error' event before 'finish'):

  const { once } = require('events');

  const s3Item = await S3.getFile(objectBucket, objectKey);
  let stream = s3Item.Body.pipe(fs.createWriteStream(sourceFilePath, { mode: 0o777 }));

  await once(stream, 'finish');
  logger.info(` pipe done `);

  logger.info(`start decompressing...`);
  await decompressArchive(sourceFilePath, targetDirectory_decompressed);

Or, you can use the promisified pipeline() from the stream/promises module instead of .pipe(). Lots of ways to do this.
