I have a text file with a URL and the desired file name on each line (separated by a space). I am looping through this file, downloading each URL and saving it under the desired name using wget:
while IFS=' ' read -r a b
do wget "$a" -O "$b"
done < list.txt
The problem is that my list contains almost 9,000 files, so downloading them one by one will take a long time. Is there any way I can download them simultaneously?
CodePudding user response:
Try with:
while IFS=' ' read -r a b; do
    wget "$a" -O "$b" &
done < list.txt
# Wait for all background jobs to terminate...
wait
PS: Keep in mind that the server may not be able to handle 9,000 concurrent connections, so I suggest you limit the number of simultaneous downloads.
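If you prefer not to manage the background jobs yourself, one possible approach is to let xargs cap the parallelism. This is a minimal sketch, assuming GNU xargs is available and that neither the URLs nor the file names contain spaces:

# Take two whitespace-separated tokens (URL, name) per invocation and
# run at most 10 wget processes at any one time
xargs -n 2 -P 10 sh -c 'wget "$1" -O "$2"' _ < list.txt

The trailing _ fills $0 of the inline script, so the two tokens from each line land in $1 and $2.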
CodePudding user response:
You can do something like:
number=10
while IFS=' ' read -r a b; do
    # Wait while the maximum number of downloads is already running;
    # with a plain if/else the current line would be skipped, not retried
    while [ "$(jobs -r | wc -l)" -ge "$number" ]; do
        sleep 1
    done
    echo "Starting $a..."
    wget "$a" -O "$b" &
done < list.txt
wait
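On bash 4.3 or newer you can avoid the polling sleep entirely with wait -n, which blocks until any single background job exits. A minimal sketch under that assumption:

number=10
while IFS=' ' read -r a b; do
    # Once $number downloads are running, block until one of them exits
    while [ "$(jobs -r | wc -l)" -ge "$number" ]; do
        wait -n
    done
    wget "$a" -O "$b" &
done < list.txt
wait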