Windows batch file: download many small files fast


I have a list of urls (urls.txt):

https://example.com/1.webp
https://example.org/bar2.webp
... 10k more

Files vary in size from 1 KB to 100 KB.

How can I download these files quickly on a Windows 10 machine without installing any third-party software?

I tried this:

powershell -Command "Invoke-WebRequest https://example.com/1.webp -OutFile 1.webp"

but it is extremely slow because the requests run sequentially, one at a time.
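
Scaled up to the whole list, what I am effectively doing is a sequential loop like this (shown only to illustrate the problem; it makes roughly 10k requests one after another):

# baseline: one request at a time over the whole list
Get-Content .\urls.txt | ForEach-Object {
    Invoke-WebRequest $_ -OutFile (Split-Path -Leaf $_)
}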

CodePudding user response:

"...how do I iterate over a file lines with it? Sry, I never used Windows" (that must feel like me after a Linux machine).

Open a PowerShell prompt (Start → Run → PowerShell) or just type PowerShell.exe at the command prompt.

At the PowerShell prompt, run the downloads in parallel using ForEach-Object -Parallel (available in PowerShell 7 and later):

1..9 | ForEach-Object -Parallel { Invoke-WebRequest "https://example.com/$_.webp" -OutFile "$_.webp" }

Where "$_" is the current item (1to9`), you might also use a list here, like:

'One', 'Two', 'Three' | ForEach-Object -Parallel { ...
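
Note that -Parallel runs five script blocks at a time by default; -ThrottleLimit raises that. Written out in full with a list as input, it might look like this (the URL pattern and file names are only placeholders):

'One', 'Two', 'Three' | ForEach-Object -Parallel {
    # placeholder URL pattern - map the current item to whatever address you need
    Invoke-WebRequest "https://example.com/$_.webp" -OutFile "$_.webp"
} -ThrottleLimit 10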

In case you "need to read it directly from the file", (presuming that you want use the name in the url as your filename) you might do something like this:

Get-Content .\urls.txt | ForEach-Object -Parallel {
    $FileName = Split-Path -Leaf $_     # last segment of the URL, e.g. '1.webp'
    Invoke-WebRequest $_ -OutFile $FileName
}
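
If some of the 10k requests fail intermittently, you may also want basic error handling; a slightly more defensive sketch (the throttle value and warning text are only examples, not a definitive implementation):

Get-Content .\urls.txt | ForEach-Object -Parallel {
    $url = $_                           # keep the URL; inside catch, $_ becomes the error record
    $FileName = Split-Path -Leaf $url
    try {
        Invoke-WebRequest $url -OutFile $FileName -ErrorAction Stop
    }
    catch {
        Write-Warning "Failed to download $url : $_"
    }
} -ThrottleLimit 20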

CodePudding user response:

You could try the ForEach-Object -Parallel method for this case. I tried something similar once, starting multiple robocopy processes to copy around 1000 small files (5-10 KB) to another hard drive.

I will see if I can find it again.
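
From memory, the idea was roughly this: start one robocopy process per source folder and wait for them all to finish. A rough sketch, not the original script (the paths are placeholders and assumed to contain no spaces):

# one robocopy process per source subfolder, all running at once
$procs = Get-ChildItem -Directory C:\Source | ForEach-Object {
    Start-Process -FilePath robocopy -ArgumentList $_.FullName, "D:\Target\$($_.Name)", '/E' -PassThru -WindowStyle Hidden
}
$procs | Wait-Process   # block until every copy has finished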

Edit 1: you can go over the links like this, for example (assuming the CSV has a column, say 'url', that holds the links):

$allmylinks = Import-Csv -Path "path to your csv"
$allmylinks | ForEach-Object -Parallel {
    Invoke-WebRequest $_.url    # 'url' = whichever CSV column holds the link
}
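
Note that Invoke-WebRequest without -OutFile only returns the response to the pipeline; to actually save each file you would add it, roughly like this (still assuming the 'url' column):

$allmylinks | ForEach-Object -Parallel {
    # save each file under the last segment of its URL
    Invoke-WebRequest $_.url -OutFile (Split-Path -Leaf $_.url)
}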