Say I want to run a job on the cluster: job1.m
Slurm handles the batch jobs, and I'm loading Mathematica to save the output file job1.csv.
If I submit job1.m and then modify the file, which version does Slurm run?
I am assuming job1.m is a Mathematica script, run from inside a Bash submission script. In that case, job1.m is read when the job starts: if it is modified after submission but before the job starts, the modified version will run; if it is modified after the job starts, the original version will run.
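To make this concrete, a minimal submission script for this setup might look like the following. This is a sketch: the module name and the `math -script` invocation are assumptions about your cluster's Mathematica installation, so adjust them to match your site.

```shell
#!/bin/bash
#SBATCH --job-name=job1
#SBATCH --output=job1.%j.log
#SBATCH --time=01:00:00

# Hypothetical module name; check `module avail` on your cluster.
module load mathematica

# job1.m is opened only when this line executes, i.e. at job start,
# so edits made between submission and job start are picked up.
math -script job1.m
```

Here job1.m itself is expected to write job1.csv (e.g. via Export["job1.csv", data]).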
If job1.m is the submission script itself (so you run sbatch job1.m), that script is copied at submission time into a spool directory specific to the job, so even if it is modified after submission, the original version will still run.
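This spool-copy behaviour can be demonstrated without a cluster at all. The toy script below (plain Bash, no Slurm involved) copies a "submitted" script aside, edits the original, and then runs the copy; the names and paths are illustrative only:

```shell
#!/bin/bash
# Toy simulation (no Slurm required) of Slurm's spool-copy behaviour:
# sbatch copies the submission script aside at submit time, so edits
# made afterwards do not affect what the job runs.
workdir=$(mktemp -d)
printf 'echo version-1\n' > "$workdir/job1.sh"   # script as submitted
mkdir "$workdir/spool"
cp "$workdir/job1.sh" "$workdir/spool/job1.sh"   # what sbatch does on submit
printf 'echo version-2\n' > "$workdir/job1.sh"   # edit after "submission"
out=$(bash "$workdir/spool/job1.sh")             # run the spooled copy
echo "$out"                                      # still version-1
rm -rf "$workdir"
```

On a real cluster, recent Slurm versions let you inspect the copy that was actually stored with `scontrol write batch_script <jobid> <filename>`, which is useful for checking whether a running job matches your current source file.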
In any case, for reproducibility and traceability, it is better to use a workflow manager such as Fireworks or Bosco.