  
<code>
sbatch --partition=gpu jobscript
</code>
Note that if you have options defined both in your job script and as sbatch command-line arguments, the command-line arguments will override the values in the job script. So the last example will always send the job to the “gpu” partition, regardless of what you may have defined in the job script itself. For more detailed (and complete) examples, please look at the [[..:examples:start|Examples/templates]] section.
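For example, a job script that requests one partition via an #SBATCH line can still be sent to another partition at submission time. The job name, time limit, partition name and program below are placeholders for illustration only; adjust them to your own job and to the partitions that actually exist on the cluster:
<code>
#!/bin/bash
#SBATCH --job-name=example_job
#SBATCH --time=00:10:00
#SBATCH --partition=regular

./my_program
</code>
Submitting this script with the sbatch command shown above will run the job in the “gpu” partition, since the command-line option takes precedence over the #SBATCH directive in the script.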
  
===== Job environment =====
  
Jobs will always start in the directory from which they were submitted. Do note that, unlike on the Peregrine cluster, your environment (e.g. loaded modules) will **not** be transferred to the job. This means that you should always load the required modules in your job script.
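A minimal job script therefore loads its own modules before running anything. The job name, time limit, module version and script name below are placeholders; check module avail for the modules that are actually installed:
<code>
#!/bin/bash
#SBATCH --job-name=python_job
#SBATCH --time=01:00:00

# Start from a clean module environment and load what the job needs,
# since nothing is inherited from the submission shell.
module purge
module load Python/3.9.6-GCCcore-11.2.0

python my_script.py
</code>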
  
If you really do want your submission-time environment to be transferred to the job (which we do not recommend, as it breaks the reproducibility of your scripts), you can add the following to your job script:
<code>
#SBATCH --export=ALL
</code>
Besides ALL, the %%--%%export option also accepts a comma-separated list of names of environment variables that should be transferred to the job, as well as the special value NONE, which makes the job start with a clean/empty environment.
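For example, to transfer only a few specific variables (the variable names below are just placeholders):
<code>
#SBATCH --export=MY_INPUT_DIR,MY_SETTINGS
</code>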