Starting a Slurm array job with a specified number of nodes
Question: I’m trying to align 168 sequence files on our HPC using Slurm version 14.03.0. I’m only allowed to use a maximum of 9 compute nodes at once, to keep some nodes open for other users. I renamed the files so I could use the array function in sbatch; they look like this: Sequence1.fastq.gz, Sequence2.fastq.gz, … Sequence168.fastq.gz. I can’t figure out how to tell Slurm to run all 168 files, 9 at a time. I can get it to run all 168 files, but then it uses all the available nodes.
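A minimal sketch of the usual approach: Slurm's job-array throttle syntax `--array=1-168%9` submits 168 tasks but allows at most 9 to run concurrently. One caveat worth checking: the `%N` throttle was introduced in a later Slurm release than 14.03 (it appears in 14.11), so it may not be accepted on this cluster. The aligner command, reference path, and output names below are hypothetical placeholders; substitute your actual alignment pipeline.

```shell
#!/bin/bash
#SBATCH --job-name=align
#SBATCH --nodes=1                # one node per array task
#SBATCH --array=1-168%9          # 168 tasks, at most 9 running at once

# Each array task selects its own input file via SLURM_ARRAY_TASK_ID,
# which matches the number embedded in the renamed files.
INPUT="Sequence${SLURM_ARRAY_TASK_ID}.fastq.gz"

# Hypothetical aligner invocation; replace with your real command and reference.
bwa mem ref.fa "$INPUT" > "aligned_${SLURM_ARRAY_TASK_ID}.sam"
```

Submitted once with `sbatch align.sh`, this keeps 9 tasks (and therefore 9 nodes) busy until all 168 files have been processed, with Slurm starting the next task as each one finishes.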