How to reduce file size of millions of images?


I have a few million images stored as JPEGs. I'd like to reduce the file size of each one by re-encoding it at 80% quality. Here's the bash loop I'm currently using (I'm on macOS):

for i in *.jpg; do convert "$i" -quality 80% "$i"; done

The above line converts the images sequentially. Is there a way to parallelize and thus speed up this conversion? I don't need to use bash; I just want the fastest way to do the conversion.

CodePudding user response:

Using Python you can do it this way:

import glob
import subprocess
from tqdm.contrib.concurrent import thread_map

def reduce_file(filepath):
    # Write the re-encoded copy next to the original instead of overwriting it.
    # The glob below guarantees the ".jpg" suffix, so stripping 4 characters is safe.
    output = filepath[:-4] + "_reduced.jpg"
    # Pass the arguments as a list so file names with spaces or shell
    # metacharacters are handled safely. Threads are fine here because the
    # real work happens in the external convert processes.
    subprocess.run(["convert", filepath, "-quality", "80%", output], check=True)

thread_map(reduce_file, glob.glob("./images/*.jpg"))

This assumes your images live in images/*.jpg.
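
If you want to cap how many conversions run at once (for example, to avoid saturating the disk), thread_map forwards a max_workers argument to the underlying ThreadPoolExecutor. A minimal sketch based on the snippet above; the value 8 is just an example:

# Same call as above, but with an explicit worker count.
# max_workers is passed through to ThreadPoolExecutor; 8 is an arbitrary
# example value -- tune it to your CPU and disk.
thread_map(reduce_file, glob.glob("./images/*.jpg"), max_workers=8)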

CodePudding user response:

With GNU xargs. This runs 10 convert processes in parallel and starts a new one as soon as a running process finishes, so that 10 stay busy until the input is exhausted.

printf "%s\n" *.jpg | xargs -n 1 -P 10 -I {} convert {} -quality 80% {}

xargs replaces every {} in the convert command with a file name read from stdin.

This assumes your file names do not contain line breaks. Note that the original files are overwritten in place.
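
If some file names could contain spaces or line breaks, a NUL-delimited pipeline sidesteps the issue. A sketch along the same lines (not part of the original answer), assuming the JPEGs are in the current directory:

# find emits NUL-terminated names and xargs -0 reads them, so any file
# name is handled safely; -P 10 keeps 10 convert processes running.
find . -maxdepth 1 -name '*.jpg' -print0 | xargs -0 -P 10 -I {} convert {} -quality 80% {}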
