...

The executable script used in the workflow definition should primarily set up the environment and submit the analysis script to the HPC workload management infrastructure. For example, a simple executable script that uses SLURM's sbatch to submit the analysis script to the psdebugq queue is available here - /reg/g/psdm/tutorials/batchprocessing/arp_submit.sh

Code Block
#!/bin/bash


# Set up the psana environment, then submit the analysis script to SLURM
source /reg/g/psdm/etc/psconda.sh
ABS_PATH=/reg/g/psdm/tutorials/batchprocessing
sbatch -q psdebugq -o "logs/%J.log" --nodes=2 --time=5 $ABS_PATH/arp_actual.py "$@"

This script will submit /reg/g/psdm/tutorials/batchprocessing/arp_actual.py on psdebugq and store the log files in /reg/d/psdm/dia/diadaq13/scratch/logs/<slurm_job_id>.out. /reg/g/psdm/tutorials/batchprocessing/arp_actual.py will be passed the parameters as command line arguments and will inherit the EXPERIMENT, RUN_NUM and JID_UPDATE_COUNTERS environment variables.
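As a rough illustration (not the actual contents of arp_actual.py), an analysis script in this setup might start out along the following lines, reading its parameters from sys.argv and the ARP-provided environment variables from os.environ:

Code Block
#!/usr/bin/env python
# Hypothetical sketch of an ARP analysis script; the real arp_actual.py will differ.
import os
import sys

# Parameters from the workflow definition arrive as command line arguments.
params = sys.argv[1:]

# Environment variables inherited from the ARP submission.
experiment = os.environ.get("EXPERIMENT")           # experiment name
run_num = os.environ.get("RUN_NUM")                 # run number being processed
update_url = os.environ.get("JID_UPDATE_COUNTERS")  # endpoint for reporting progress

print("Processing run %s of experiment %s with parameters %s" % (run_num, experiment, params))
print("Progress updates would be reported via %s" % update_url)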

...