I want to use a pipe with mkfifo
to capture all output from several processes running concurrently, roughly like the following
PIPENAME="/tmp/foo"
echo "Make ${PIPENAME:?}" && mkfifo "${PIPENAME:?}"
echo "hello 1" >"${PIPENAME:?}" &
echo "hello 2" >"${PIPENAME:?}" &
wait < <(jobs -p)
cat "${PIPENAME:?}" | sort | uniq
echo "Removing FIFO ${PIPENAME:?}" && rm "${PIPENAME:?}"
However, the above hangs: none of the echo "hello..." >"${PIPENAME}"
processes terminate, so the script never reaches the wait line.
I guess a process writing to the pipe cannot terminate until its output is consumed from the other end? If so, what's the canonical way to achieve my goal here? To restate: I want to start a few (fewer than 20) possibly long-running commands in parallel and have their outputs combined without losing any lines.
CodePudding user response:
Unless you have some other reason for using a fifo, I think that what you are trying to do can be done with:
{
echo 'hello 1' &
echo 'hello 2' &
wait
} | sort -u
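For instance, extending the group with a duplicate line (an addition for illustration) shows that sort -u both merges the concurrent output and deduplicates it:

```shell
#!/usr/bin/env bash
# Command group: the group's stdout is a single pipe that all the
# concurrent echos inherit, so no FIFO is needed.
{
    echo 'hello 2' &
    echo 'hello 1' &
    echo 'hello 1' &   # duplicate; sort -u keeps a single copy
    wait               # block until all background jobs in the group finish
} | sort -u
```

This prints "hello 1" and "hello 2", each exactly once.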
CodePudding user response:
Since a FIFO is not a regular file but just a reference to a (named) pipe, a process can write to it only once it has been opened on both ends; see fifo(7).
So your echo commands won't even start to write their output to the pipe; instead each one blocks until another process attaches to the pipe for reading.
what's the canonical way to achieve my goal here?
That depends greatly on what your goal is:
If you just want to combine the output of multiple commands before piping it to another command, use the group command syntax from pjh's answer.
If you have to use a FIFO, attach the reading process before waiting on the writing commands:
echo "hello 1" >"${PIPENAME:?}" &
echo "hello 2" >"${PIPENAME:?}" &
cat "${PIPENAME:?}" | sort | uniq &
wait < <(jobs -p)
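Put together as a complete, runnable sketch. The mktemp path and the extra file descriptor held by the parent are additions, not part of the answer above; the descriptor avoids a startup race in which one writer could open and close the FIFO before the others attach, making the reader see an early EOF:

```shell
#!/usr/bin/env bash
PIPENAME="$(mktemp -u)"            # unique pipe path; a fixed /tmp path works too
mkfifo "${PIPENAME:?}"

sort -u <"${PIPENAME:?}" &         # attach the reader first
reader=$!

exec 3>"${PIPENAME:?}"             # parent keeps one write end open, so the
                                   # reader cannot see EOF between writers
echo "hello 1" >&3 & w1=$!
echo "hello 2" >&3 & w2=$!
wait "$w1" "$w2"                   # wait for the writers only

exec 3>&-                          # close the last write end: reader sees EOF
wait "$reader"
rm "${PIPENAME:?}"
```

The reader's open blocks until the parent's exec 3> opens the write end, and the reader only sees EOF after the parent closes fd 3, regardless of how the writers are scheduled.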
If you just want to collect the output for later processing, use a regular file instead:
LOGFILE="/tmp/foo"
echo "hello 1" >> "$LOGFILE" &
echo "hello 2" >> "$LOGFILE" &
wait < <(jobs -p)
# anytime later
cat "$LOGFILE" | sort | uniq
But be aware that with all of these solutions, very long lines or unfortunate process scheduling can cause the output of one process to land in the middle of a line from another.
To prevent this you may need to use a proper logging tool.
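One low-tech mitigation is to force line buffering so each full line goes out in a single write() call (atomic on pipes for lines shorter than PIPE_BUF, and in practice not interleaved for O_APPEND writes to a local file) and to tag each job's lines so they stay attributable. This is a sketch: stdbuf is GNU coreutils, and the jobN: prefixes are illustrative.

```shell
#!/usr/bin/env bash
LOGFILE="$(mktemp)"
for i in 1 2; do
    # stdbuf -oL makes sed flush one full line per write(), so lines from
    # different jobs cannot be spliced together mid-line
    { echo "start $i"; echo "done $i"; } | stdbuf -oL sed "s/^/job$i: /" >>"$LOGFILE" &
done
wait
sort "$LOGFILE"
rm "$LOGFILE"
```

Each line now carries its job's tag, so even if lines from different jobs alternate in the file, no line is corrupted and the origin of every line is clear.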