How to run several Python files iteratively from a bash script using only 7 CPUs?

On my machine, which has eight CPUs, I wish to run a number of Python files. I only want to use 7 CPUs concurrently. Is it possible to automate this process?

Currently, I am doing this:

pids0=""
for i in $(seq 1.5 0.05 1.8); do
   python3 $i &
   pids0="$pids0 $!"
done
wait $pids0


pids1=""
for i in $(seq 1.85 0.05 2.0); do
   python3 $i &
   pids1="$pids1 $!"
done
wait $pids1

Each loop is designed to submit at most 7 scripts, so to perform 100 calculations I have to write several for-loops. As you can see, this approach is very inefficient, because I have to wait for all 7 submitted Python scripts to finish before I can move on to the next loop.

My goal is to write code that submits the next task as soon as any CPU becomes free after a previous task finishes, without writing several for-loops.

PS: I am using my office computer and I am not an administrator.

CodePudding user response:

You can use GNU Parallel:

parallel -j 7 python3 script.py ::: $(seq 1.5 0.05 2.0)

EDIT according to Mark Setchell's comment
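The -j 7 flag tells GNU Parallel to keep at most 7 jobs running at once and to start the next one as soon as any running job finishes, so the whole list of values is worked through with no idle slots. Each value produced by seq is appended to the command, i.e. it runs python3 script.py 1.5, python3 script.py 1.55, and so on (here script.py stands in for however your scripts actually take the value).

If GNU Parallel is not installed and you cannot get it installed system-wide (it can usually be installed into your home directory without root access), you can approximate the same behaviour with plain bash job control. This is only a minimal sketch, assuming bash 4.3 or newer for wait -n and, like the command above, that each value is passed as an argument to a placeholder script.py:

max_jobs=7
for i in $(seq 1.5 0.05 2.0); do
    # if max_jobs jobs are already running, wait until any one of them exits
    while [ "$(jobs -rp | wc -l)" -ge "$max_jobs" ]; do
        wait -n
    done
    python3 script.py "$i" &    # script.py is a placeholder for your real script
done
wait    # wait for the remaining jobs to finish

Another option that relies only on standard tools is xargs, which can also cap the number of parallel processes:

seq 1.5 0.05 2.0 | xargs -n 1 -P 7 python3 script.py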
