Running multiple similar job files (sbatch)


I am trying to run multiple (several hundred) very similar job-files with slurm using sbatch.

My .job file looks like:

#SBATCH ...
...
...
srun ./someProg -a A -b B -c C -d D

Is there a convenient way to submit the job file with sbatch, supplying multiple options for A, B, C and D and generating a new job for every combination of A/B/C/D, without just creating hundreds of .job files? (I have already seen job arrays in a lot of Slurm files, but I do not think they will help me here.)

CodePudding user response:

without just generating hundreds of .job files?

You can use bash's process substitution to avoid creating the files:

#!/bin/bash

genjob() {
    local content
    IFS='' read -d '' -r content <<-EOF
        #!/bin/bash
        #SBATCH ...
        ...
        ...
        srun ./someProg $(printf '%q ' "$@")
    EOF
    printf '%s\n' "$content"
}

sbatch <(genjob -a A -b B -c C -d D)
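Since the question asks for every combination of A/B/C/D, nested loops can drive the submissions once genjob is defined. The parameter sets below are placeholders, and the loop only collects and prints the commands as a dry run; the real call is shown in the comment:

```shell
#!/bin/bash
# Placeholder parameter sets; substitute your real values.
As=(1 2); Bs=(x y); Cs=(10); Ds=(on off)

cmds=()
for a in "${As[@]}"; do
  for b in "${Bs[@]}"; do
    for c in "${Cs[@]}"; do
      for d in "${Ds[@]}"; do
        # The real submission would be:
        #   sbatch <(genjob -a "$a" -b "$b" -c "$c" -d "$d")
        cmds+=("sbatch <(genjob -a $a -b $b -c $c -d $d)")
      done
    done
  done
done

# Dry run: list the 2*2*1*2 = 8 submissions that would be made.
printf '%s\n' "${cmds[@]}"
```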

Important: the dash in <<-EOF means that TAB characters at the beginning of each line of the heredoc are stripped, so the indentation must be done with TABs, not spaces.
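As an aside, the `printf '%q '` inside genjob is what keeps the arguments safe once they are pasted into the generated script: it shell-quotes each one, so values containing spaces or metacharacters round-trip intact. A quick standalone illustration:

```shell
#!/bin/bash
# Shell-quote two option values: one with a space, one with a '$'.
quoted=$(printf '%q ' -a 'hello world' -b '$HOME')

# Re-parsing the quoted string recovers the original arguments exactly.
eval "set -- $quoted"
echo "arg2=$2 arg4=$4"    # arg2 keeps its space, arg4 stays a literal $HOME
```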

CodePudding user response:

You say you have several hundred jobs like that. That may be more than your available core count, so be careful how you submit them: you want as many running as possible, but not all of them launched at once.
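For completeness, Slurm itself can throttle this with a job array: a `%` limit on `--array` caps how many tasks run concurrently, and each task can look up its parameter combination in a file. This is a sketch, not a drop-in solution; the file name `params.txt` and its one-combination-per-line layout are assumptions:

```shell
#!/bin/bash
#SBATCH --array=0-399%32    # 400 tasks total, at most 32 running at once
#SBATCH ...

# params.txt (assumed): one "A B C D" combination per line.
read -r a b c d < <(sed -n "$((SLURM_ARRAY_TASK_ID + 1))p" params.txt)
srun ./someProg -a "$a" -b "$b" -c "$c" -d "$d"
```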

Here are two utilities that accept an arbitrarily long list of command lines and spread them over the available nodes/cores:

https://github.com/TACC/launcher

https://github.com/TACC/pylauncher
