...
```
# modulefiles relies on the MODULEPATH environment variable
$ export MODULEPATH=/usr/share/Modules/modulefiles:/etc/modulefiles:/afs/slac/package/singularity/modulefiles:/opt/modulefiles

# list available modulefiles
$ module avail
------------------------------------------ /afs/slac/package/singularity/modulefiles ------------------------------------------
amira/6.7.0            eman2/20190418     motioncor2/1.2.2          relion/3.0                 slac-ml/20181002.0
cdms-jupyterlab/1.6    eman2/20190603     motioncor2/1.2.3-intpix   relion/3.0.2               slac-ml/20190606.1
chimera/1.13.1         emClarity/1.0.0    phenix/1.14-3260          relion/3.0.4               slac-ml-devel/20181002.0
ctffind/4.1.10         git/2.13.0         protomo/2.4.2             relion/3.0_beta-20181121   xds/20190315
ctffind/4.1.13         icon-gpu/1.2.9     pymol/2.1.1               resmap/1.95
eman2/20181015         imod/4.9.10        pymol/2.2                 rosetta/2018.48
eman2/20190320         imod/4.9.11        rclone/1.44               rosetta/3.10
eman2/20190324         motioncor2/1.2.1   relion/2.1                scipion/1.2.1
------------------------------------------------------ /opt/modulefiles -------------------------------------------------------
boost/1.69.0-openmpi-3.1.2-gcc-4.8.5   lsf
boost/1.69.0-openmpi-3.1.2-gcc-7.3.1   openmpi/3.1.2-gcc-4.8.5
cuda/10.0(default)                     openmpi/3.1.2-gcc-7.3.1
cuda/9.0                               openmpi/3.1.3-gcc-4.8.5
cuda/9.2                               openmpi/3.1.3-gcc-7.3.1
fftw3/3.3.8-openmpi-3.1.2-gcc-4.8.5    parallel-hdf5/1.10.4-openmpi-3.1.2-gcc-4.8.5
fftw3/3.3.8-openmpi-3.1.2-gcc-7.3.1    parallel-hdf5/1.10.4-openmpi-3.1.2-gcc-7.3.1
gcc/4.8.5                              parallel-hdf5/1.10.4-openmpi-3.1.3-gcc-4.8.5
gcc/6.3.1                              parallel-hdf5/1.10.4-openmpi-3.1.3-gcc-7.3.1
gcc/7.3.1(default)                     PrgEnv-gcc/4.8.5(default)
intel/2019.4.227(default)              PrgEnv-gcc/7.3.1
intelmpi/2019.4.227(default)           python/anaconda3(default)
slac-ml/20190606.1
...
```
There is a specific module called slac-ml
that provides a prebaked Singularity image derived from the jupyterhub image.
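As a quick interactive sketch (assuming you have already set MODULEPATH as shown above, and that the slac-ml module puts a wrapped `jupyter` command on your PATH):

```shell
# make the slac-ml modulefile visible and load it
export MODULEPATH=$MODULEPATH:/afs/slac/package/singularity/modulefiles
module load slac-ml/20190606.1

# commands such as jupyter now run inside the prebaked Singularity image
jupyter --version
```

This is the same environment the batch script below sets up non-interactively.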
...
To run on the batch system, we will use this modulefile system and submit jobs by creating a text file (e.g. myscript.sh) like:
```
#!/bin/bash -l

#BSUB -P jupyter
#BSUB -J my_batch_job_name
#BSUB -q slacgpu
#BSUB -n 1
#BSUB -R "span[hosts=1]"
#BSUB -W 72:00
#BSUB -B

# setup env
source /etc/profile.d/modules.sh
export MODULEPATH=/usr/share/Modules/modulefiles:/opt/modulefiles:/afs/slac/package/singularity/modulefiles
module purge
module load PrgEnv-gcc
module load slac-ml/20190606.1

# run the notebook, executing all cells
cd MY_DATA_DIRECTORY
jupyter nbconvert --to notebook --inplace --execute mynotebook.ipynb
```
You can then submit the job to batch via
```
bsub < myscript.sh
```
and you can monitor the job with
```
bjobs -l {jobid}
```
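A few other commonly used LSF commands for managing a submitted job (a sketch; {jobid} is the numeric id printed by bsub at submission time):

```shell
# summarize all of your pending and running jobs
bjobs

# stream the job's stdout/stderr while it is running
bpeek {jobid}

# kill the job if needed
bkill {jobid}
```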
We recommend using either nbconvert or papermill to provide parameterized access to notebooks.
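For example, papermill can inject parameter values into a notebook cell tagged "parameters" at execution time. A minimal sketch, assuming papermill is installed in the environment; the notebook names and the `alpha` parameter are hypothetical placeholders:

```shell
# execute mynotebook.ipynb, overriding the value of "alpha"
# defined in the cell tagged "parameters", and write the
# executed notebook (with outputs) to output.ipynb
papermill mynotebook.ipynb output.ipynb -p alpha 0.1
```

The `papermill` invocation could replace the `jupyter nbconvert` line in the batch script above when per-job parameters are needed.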