How to prevent kill signal propagation between piped streams of NodeJS child processes?


Issue description

I have a child process spawned by NodeJS whose output stream (stdout) needs to be connected to the input stream (stdin) of a second NodeJS child process.

However, from time to time the first process gets killed. When that happens, I want to restart it and rewire its output stream to the same second process's input, without having to restart the second process.

First try

I first tried to connect the stdout and stdin, which works fine until a kill signal is received by the first process:

const cp = require('node:child_process')
const firstProc = cp.spawn('/some/proc/path', [/* args */])
const secondProc = cp.spawn('/ffmpeg/path', [/* args */])
// forward the first process's stdout into the second process's stdin
firstProc.stdout.pipe(secondProc.stdin)

But as soon as the first process receives a kill signal, it gets propagated to the second process which terminates as well.

On the main NodeJS process I'm able to intercept a SIGINT signal, for example, but this does not seem to be available for child processes:

process.on('SIGINT', () => {
  /* do something upon SIGINT kill signal */
})
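
The closest thing I've found for a child process is the 'exit' event on the ChildProcess object, which at least reports the terminating signal, but only after the child is already gone:

firstProc.on('exit', (code, signal) => {
  /* signal is e.g. 'SIGINT' when the child was killed by a signal */
})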

Question summary

So my question is: is it possible to intercept the kill signal on a child process before it reaches the second process, 'detach' the stream connection, start a new process, and pipe its output into the second process's input stream?
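
To illustrate what I mean, here is a hypothetical sketch of the rewiring I'm after (I suspect the unpipe() here may fire too late, which is exactly what I'm unsure about):

firstProc.on('exit', () => {
  firstProc.stdout.unpipe(secondProc.stdin) // detach before the 'end' arrives?
  const replacement = cp.spawn('/some/proc/path', [/* args */])
  replacement.stdout.pipe(secondProc.stdin)
})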

Additional Notes

I've tried adding a duplex transform stream between the stdout and the stdin, but that doesn't resolve my problem either, as the transform closes as well when its input side ends.
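
Roughly what I tried, reconstructed here with a PassThrough (a sketch, not my exact code): the first pipe() uses the default end: true, so the intermediate stream ends as soon as the first stdout ends, and that end then cascades into secondProc.stdin anyway.

const { PassThrough } = require('node:stream')
const relay = new PassThrough()
firstProc.stdout.pipe(relay)  // default end: true, so relay ends with stdout…
relay.pipe(secondProc.stdin)  // …and ending relay ends secondProc.stdin too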

I've also thought about creating some kind of socket connection between the two processes, but I've never done anything like that and I'm a bit wary of the added complexity.
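
For completeness, the socket variant I have in mind would look roughly like this (a hypothetical sketch with a made-up socket path; the parent relays every connection into secondProc.stdin with end: false, so a dropped connection doesn't end the pipeline):

const net = require('node:net')
const server = net.createServer((conn) => {
  conn.pipe(secondProc.stdin, { end: false }) // losing one connection must not end stdin
})
server.listen('/tmp/relay.sock', () => {
  // each incarnation of firstProc gets its own connection
  firstProc.stdout.pipe(net.connect('/tmp/relay.sock'))
})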

If there is an easier way to handle my scenario, I'd be glad to know! Thanks for any idea!

CodePudding user response:

See https://nodejs.org/api/stream.html#readablepipedestination-options:

By default, stream.end() is called on the destination Writable stream when the source Readable stream emits 'end', so that the destination is no longer writable. To disable this default behavior, the end option can be passed as false, causing the destination stream to remain open.

So you're looking for something like

const cp = require('node:child_process');

const secondProc = cp.spawn('/ffmpeg/path', [/* args */]);

function writeForever() {
  const firstProc = cp.spawn('/some/proc/path', [/* args */]);
  // end: false keeps secondProc.stdin open when this stdout ends
  firstProc.stdout.pipe(secondProc.stdin, { end: false });
  firstProc.stdout.on('end', writeForever); // just spawn a new firstProc and continue…
}
writeForever();
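
One caveat, assuming you eventually want a clean shutdown: with end: false, secondProc.stdin is never ended automatically anymore, so once you decide that no replacement firstProc is coming, you have to call secondProc.stdin.end() yourself, otherwise ffmpeg will keep waiting for input.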

CodePudding user response:

Since I use the Angular framework, I'm quite accustomed to RxJS, which makes this kind of streaming task very easy.

If you are manipulating a lot of streams, I would suggest using RxJS together with the rxjs-stream package.

The resulting code would look like this:

import * as cp from 'node:child_process';
import { concat, of } from 'rxjs';
import { rxToStream, streamToRx } from 'rxjs-stream';

// concat takes the source observables as separate arguments (not an
// array), and streamToRx expects a readable stream, i.e. the child's
// stdout rather than the ChildProcess itself:
const concatenatedStreams$ = concat(
  streamToRx(cp.spawn('/some/proc/path', [/* args */]).stdout),
  //of('End of first, start of second'), // Optional
  streamToRx(cp.spawn('/ffmpeg/path', [/* args */]).stdout),
);

rxToStream(concatenatedStreams$).pipe(process.stdout);
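
If you also want the respawn behaviour from the question, the same two libraries can express it with defer and repeat. A minimal sketch, assuming RxJS 7 and that a killed process simply ends its stdout:

import * as cp from 'node:child_process';
import { defer, repeat } from 'rxjs';
import { rxToStream, streamToRx } from 'rxjs-stream';

const secondProc = cp.spawn('/ffmpeg/path', [/* args */]);

// defer spawns a fresh firstProc on every subscription, and repeat()
// resubscribes each time the previous stdout completes:
const forever$ = defer(
  () => streamToRx(cp.spawn('/some/proc/path', [/* args */]).stdout)
).pipe(repeat());

rxToStream(forever$).pipe(secondProc.stdin, { end: false });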