I would like parallel to read from the (seq numbers) pipe, so that running something like this:
seq 2000 | parallel --max-args 0 --jobs 10 "{ read test; echo $test; }"
would be equivalent to running:
echo 1
echo 2
echo 3
echo 4
...
echo 2000
But unfortunately, the pipe was not read by parallel, so it instead ran like:
echo
echo
echo
...
echo
And the output is empty.
Does anyone know how to make parallel read the (seq numbers) pipe? Thanks.
CodePudding user response:
An alternative with GNU xargs that does not require GNU parallel:
seq 2000 | xargs -P 10 -I {} echo "hello world {}"
Output:
hello world 1
hello world 2
hello world 3
hello world 4
hello world 5
.
.
.
From man xargs:
-P max-procs
    Run up to max-procs processes at a time; the default is 1. If max-procs is 0, xargs will run as many processes as possible at a time.
-I replace-str
    Replace occurrences of replace-str in the initial-arguments with names read from standard input.
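Putting the two options from the excerpt together, a minimal sketch: -P 10 keeps up to 10 echo processes running at a time, and -I {} substitutes each input line into the command; all 2000 output lines are still produced (their order may interleave under -P).

```shell
# -P 10: up to 10 concurrent processes; -I {}: one invocation per input
# line, with {} replaced by that line. Counting lines confirms nothing
# is dropped even though the work runs in parallel.
seq 2000 | xargs -P 10 -I {} echo "hello world {}" | wc -l
```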
CodePudding user response:
Using xargs instead of parallel, while still using a shell (instead of starting a new copy of the /bin/echo executable per line), would look like:
seq 2000 | xargs -P 10 \
sh -c 'for arg in "$@"; do echo "hello world $arg"; done' _
This is likely to be faster than the existing answer by Cyrus, because starting executables takes time. Even though starting a new copy of /bin/sh takes longer than starting a copy of /bin/echo, this version doesn't use -I {}, so it can pass many arguments to each copy of /bin/sh, amortizing that startup cost over more numbers; and this way we use the echo built into sh instead of the separate echo executable.
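To see the amortization concretely, compare invocation counts with and without -I (a rough sketch; the batch size without -I is system-dependent, bounded by the command-line length limit):

```shell
# With -I {}, xargs runs the command once per input line: 2000 invocations.
# Without -I, xargs packs as many arguments as fit onto each command line,
# so the same 2000 inputs typically need only one or a few sh invocations.
# Each invocation below prints a single line reporting its batch size.
seq 2000 | xargs sh -c 'echo "one sh invocation handled $# args"' _
```

The trailing `_` fills `$0` for the inline script, so the piped numbers land in `$1`, `$2`, ... and `"$@"` sees all of them.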