Is it possible to write to a queue from multiple processes in an atomic manner? For example:
#!/bin/bash
rm ./queue
mkfifo ./queue
curl http://www.url1.com > ./queue &
curl http://www.url2.com > ./queue &
cat ./queue
Output order (www.url1.com first or www.url2.com first) does not matter. I would like each download's content to stay intact, not interleaved with the other, regardless of its size. Is this possible in the Linux shell? A named FIFO is not obligatory.
CodePudding user response:
Just with GNU parallel:
parallel --group curl ::: http://www.url1.com http://www.url2.com
Without GNU parallel you can take a lock around the output:
# use the script file itself as the lock file
lockfile=$0
func() {
    a=$(curl -s "$1")
    # hold the lock while writing, so outputs never interleave
    flock "$lockfile" cat <<<"$a"
}
func http://www.url1.com &
func http://www.url2.com &
wait
# with GNU xargs: export both the function and the lock file path,
# and pass one URL per invocation (-n1) so they actually run in parallel
export -f func
export lockfile
printf "%s\n" http://www.url1.com http://www.url2.com |
    xargs -n1 -P0 bash -c 'func "$@"' _
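Another option that needs no locking at all: let each job write to its own temporary file, then concatenate the files sequentially. Since only one process ever reads the results, the outputs can never interleave. This is a minimal sketch; `printf` stands in for `curl` so it runs without network access, and the file names `1` and `2` are arbitrary.

```shell
#!/bin/bash
# Each job writes to a private temp file; cat merges them afterwards.
tmpdir=$(mktemp -d)
trap 'rm -rf "$tmpdir"' EXIT

fetch() {
    # real version would be: curl -s "$1" > "$tmpdir/$2"
    printf '%s\n' "content of $1" > "$tmpdir/$2"
}

fetch http://www.url1.com 1 &
fetch http://www.url2.com 2 &
wait    # both downloads finish before we read anything

result=$(cat "$tmpdir"/1 "$tmpdir"/2)
printf '%s\n' "$result"
```

The trade-off versus `flock` is buffering: nothing appears on stdout until a job has finished, but you also get a deterministic output order for free.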