Atomic output from multiple processes in the Linux shell

Issue

Is it possible to write from multiple processes to a shared queue atomically?
For example:

#!/bin/bash                                                                                                                   

rm ./queue
mkfifo ./queue
curl http://www.url1.com > ./queue &
curl http://www.url2.com > ./queue &
cat ./queue

The output order (www.url1.com before www.url2.com or vice versa) does not matter, but I would like each response to appear as one contiguous block, regardless of its size. Is this possible in a Linux shell? A named fifo is not obligatory.
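For context: writes to a pipe are only guaranteed atomic up to PIPE_BUF bytes (4096 on Linux), so responses larger than that can interleave. One portable workaround, separate from the answer below, is to give each writer its own temp file and concatenate after wait. A minimal sketch, with printf standing in for curl:

```shell
#!/bin/bash
# Sketch: each writer gets a private temp file; cat-ing them after
# wait emits each output as one contiguous block. printf stands in
# for curl so the sketch runs offline.
t1=$(mktemp) t2=$(mktemp)
printf '%s\n' "begin 1" "end 1" > "$t1" &
printf '%s\n' "begin 2" "end 2" > "$t2" &
wait                    # both writers have finished; files are complete
cat "$t1" "$t2"         # deterministic order: all of t1, then all of t2
rm -f "$t1" "$t2"
```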

Solution

Just with GNU parallel:

parallel --group curl ::: http://www.url1.com http://www.url2.com
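Here --group (GNU parallel's default) buffers each job's output and prints it only when that job finishes, so jobs never interleave. A runnable sketch with a local command in place of curl (the job names are made up for illustration; GNU parallel must be installed):

```shell
# Each job prints two lines; --group guarantees that "start X" and
# "end X" come out adjacent, whichever job finishes first.
parallel --group 'printf "%s\n" "start {}" "end {}"' ::: jobA jobB
```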

Without GNU parallel, buffer each response and serialize the output with a lock:

lockfile=$0                       # any existing file works as the lock; here, the script itself
func() {
   a=$(curl "$1")                 # buffer the whole response in memory first
   flock "$lockfile" cat <<<"$a"  # then print it while holding the lock
}
func http://www.url1.com &
func http://www.url2.com &
wait

# with GNU xargs
export -f func
printf "%s\n" http://www.url1.com http://www.url2.com |
  xargs -n1 -P0 bash -c 'func "$@"' _   # -n1: one URL per bash invocation
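To see the flock pattern in isolation, here is a self-contained variant with local printf writers standing in for curl and a mktemp lock file (both substitutions are mine, not part of the original answer). Each worker's three lines always come out contiguous, even though the two workers race:

```shell
#!/bin/bash
# Buffer-then-lock, testable offline: each worker collects its full
# output, then prints it atomically while holding the lock.
lockfile=$(mktemp)

worker() {
    # simulate a multi-line download
    a=$(printf '%s\n' "begin $1" "line from $1" "end $1")
    flock "$lockfile" cat <<<"$a"   # serialized: no interleaving
}

worker A &
worker B &
wait
rm -f "$lockfile"
```

Which worker prints first is nondeterministic, but "begin A" … "end A" is always one uninterrupted block, and likewise for B.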

Answered By – KamilCuk

This answer, collected from Stack Overflow, is licensed under CC BY-SA 2.5, CC BY-SA 3.0 and CC BY-SA 4.0.
