I'm trying to download multiple URLs using curl:
user@PC:~$ curl -LOJ "https://example.com/foo.jpg" "https://example.com/bar.jpg"
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   445  100   445    0     0    517      0 --:--:-- --:--:-- --:--:--   518
<?xml version="1.0" encoding="iso-8859-1"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<title>404 - Not Found</title>
</head>
<body>
<h1>404 - Not Found</h1>
<script type="text/javascript" src="//wpc.75674.betacdn.net/0075674/www/ec_tpm_bcon.js"></script>
</body>
</html>
user@PC:~$ ls foo.jpg
foo.jpg
user@PC:~$ ls bar.jpg
ls: cannot access 'bar.jpg': No such file or directory
But curl only applies the arguments (-LOJ) to the first URL, so only the first file gets saved; the second response is dumped to the terminal instead (the XML above).
If I repeat the arguments for each URL, this issue no longer occurs and both files get downloaded:
user@PC:~$ curl -LOJ "https://example.com/foo.jpg" -LOJ "https://example.com/bar.jpg"
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   445  100   445    0     0    481      0 --:--:-- --:--:-- --:--:--   481
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   445  100   445    0     0   1534      0 --:--:-- --:--:-- --:--:--  1539
user@PC:~$ ls foo.jpg
foo.jpg
user@PC:~$ ls bar.jpg
bar.jpg
So is there a way to have the arguments apply to all the URLs passed to curl without having to repeat them for each URL?
CodePudding user response:
You can pipe the URLs to xargs -n1, like

{ echo "https://example.com/foo.jpg"; echo "https://example.com/bar.jpg"; } | xargs -n1 curl -LOJ

The -n1 tells xargs to start a separate curl for each URL, rather than running a single curl with both URLs as arguments.
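The same pattern works if the URLs live in a file, one per line (urls.txt here is a hypothetical filename). The sketch below prepends echo so it just prints the commands xargs would run; drop the echo to actually download:

```shell
# Build a sample urls.txt (one URL per line), then run one curl per line.
# "echo" is prepended only to show the commands xargs constructs;
# remove it to perform the real downloads.
printf '%s\n' "https://example.com/foo.jpg" "https://example.com/bar.jpg" > urls.txt
xargs -n1 echo curl -LOJ < urls.txt
```

Each line of urls.txt becomes its own curl invocation, so -LOJ is repeated for every URL automatically.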
CodePudding user response:
This error suggests the JPGs you are trying to retrieve do not exist (or at least not at the endpoint you are hitting). Note that even foo.jpg, which appeared to download, was saved containing the server's 404 page rather than an image. Have you confirmed the files are actually there? Where are you getting your list of URLs from?
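One way to catch this is curl's --fail (-f) flag, which makes curl exit with an error on HTTP 4xx/5xx responses instead of saving the error page to disk. A sketch using the URL from the question:

```shell
# -f / --fail: exit with an error on an HTTP 4xx/5xx response instead of
# saving the server's error page as foo.jpg. -s silences the progress meter.
curl -fsLOJ "https://example.com/foo.jpg" || echo "download failed"
```

With -f, a missing remote file leaves no bogus .jpg behind, so a simple ls (or the || branch above) tells you which downloads really succeeded.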