How to read from Node PassThrough stream after writing ends?

How do I write to a Node PassThrough stream, then later read that data? When I try, the code hangs as though no data was ever written. Here's a minimal example (in TypeScript):

    import { PassThrough } from 'stream';

    const stream = new PassThrough();
    stream.write('Test chunk.');
    stream.end();

    // Later
    const chunks: Buffer[] = [];
    const output = await new Promise<Buffer>((resolve, reject) => {
        stream.on('data', (chunk) => {
            chunks.push(Buffer.from(chunk));
        });
        stream.on('error', (err) => reject(err));
        stream.on('end', () => {
            resolve(Buffer.concat(chunks));
        });
    });

Please note that I can't attach the event listeners before writing to the stream: I don't know at the time of writing how I'm going to be reading from it. My understanding of a Transform stream like PassThrough was that it "decoupled" the Readable from the Writable, so that you could access them asynchronously.

CodePudding user response:

Your code works for me: the promise resolves to a buffer containing "Test chunk.".

It will fail, however, if the readable side of the stream has already started emitting data by the time stream.on('data', (chunk) => {...}) is executed. I could force that behavior by enclosing the // Later part of your code in a setTimeout and inserting an additional

stream.on("data", () => {});

before that. That extra listener puts the stream into flowing mode and causes it to start emitting immediately. Could that have happened in your case?
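For illustration, here is roughly what that reproduction looks like as a self-contained sketch (the console.log at the end is just my addition, to show that the promise never settles):

    import { PassThrough } from 'stream';

    const stream = new PassThrough();
    stream.write('Test chunk.');
    stream.end();

    // This no-op listener switches the stream into flowing mode, so the
    // buffered chunk and the 'end' event are emitted on the next tick.
    stream.on('data', () => {});

    // "Later": by now the data and 'end' have already been emitted, so the
    // listeners below never fire and the promise never settles.
    setTimeout(() => {
        const chunks: Buffer[] = [];
        new Promise<Buffer>((resolve, reject) => {
            stream.on('data', (chunk) => chunks.push(Buffer.from(chunk)));
            stream.on('error', (err) => reject(err));
            stream.on('end', () => resolve(Buffer.concat(chunks)));
        }).then((output) => console.log(output.toString())); // never reached
    }, 1000);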

To be on the safe side, end the "early" part of your code with stream.pause() and begin the "later" part with stream.resume(), for example:

    const output = await new Promise<Buffer>((resolve, reject) => {
        stream.resume();
        stream.on('data', (chunk) => {
            ...
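Putting the two pieces together, the whole flow might look like the following sketch, which simply merges the pause()/resume() calls into the code from the question (it assumes top-level await is available, as in the question):

    import { PassThrough } from 'stream';

    const stream = new PassThrough();
    stream.write('Test chunk.');
    stream.end();
    stream.pause(); // defensive: make sure nothing is emitted before listeners exist

    // Later
    const chunks: Buffer[] = [];
    const output = await new Promise<Buffer>((resolve, reject) => {
        stream.resume(); // flow only restarts on the next tick, after the listeners below attach
        stream.on('data', (chunk) => {
            chunks.push(Buffer.from(chunk));
        });
        stream.on('error', (err) => reject(err));
        stream.on('end', () => {
            resolve(Buffer.concat(chunks));
        });
    });
    console.log(output.toString()); // "Test chunk."

Because resume() only restarts the flow on a later tick, calling it before attaching the listeners in the same synchronous block is still safe.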