S3DF management doesn't like heavy compute jobs being run on the interactive cluster (iana). A solution is to run Jupyter Lab on batch compute on the milano cluster.


Via Open OnDemand:

(no port forwarding needed)

  1. Log in to Open OnDemand at https://s3df.slac.stanford.edu/ondemand
  2. Click the top menu bar and choose Interactive Apps → Jupyter
  3. Select Custom for the Jupyter Image
  4. Select Conda Environment
  5. Set the path to the conda environment used to run the Jupyter notebook. (conda info | grep "active env location" shows the path of the currently active conda environment.)
     (An example of setting up a conda environment for Jupyter Lab for the first time is in the lume-impact Jupyter example; a minimal sketch is also given after this list.)
  6. Change <environment> to the desired conda environment name
  7. Check "Use JupyterLab instead of Jupyter Notebook"
  8. For the run-on-cluster type, select Batch; s3df; account: FACET; partition: milano
  9. Select the number of hours and cores
  10. Configure the rest of the settings
  11. Launch and connect
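
If a conda environment for Jupyter has not been set up yet, the following is a minimal sketch of creating one and finding the path to enter in the form above. The environment name jupyter-env and the package list are placeholders; follow the lume-impact Jupyter example for a full setup.

conda create -n jupyter-env python=3.11 jupyterlab   # placeholder name and packages
conda activate jupyter-env
conda info | grep "active env location"              # prints the path to paste into the form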

Via Custom Port Forwarding:

  1. Follow the lume-impact example here to set up a conda environment for the Jupyter notebook and install the packages you need (lume-impact is just an example).
  2. SSH into an interactive compute cluster such as iana, forwarding the port that Jupyter Lab will use (5555 here):

     ssh -L 5555:localhost:5555 iana

  3. Use srun to allocate an interactive session on the milano cluster:

     srun --partition milano --account FACET -n 100 --time=01:00:00 --pty /bin/bash

  4. Activate the appropriate conda environment and start Jupyter Lab:

     conda activate <CONDA ENV> && module load matlab && jupyter lab --no-browser --port=5555

  5. If you cannot connect to Jupyter Lab, make sure port forwarding for port 5555 is working (a quick check is sketched below).
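
A minimal sketch for checking the connection from your local machine, assuming the tunnel from step 2 and the Jupyter Lab server from step 4 are both running:

curl -sI http://localhost:5555   # should return an HTTP response header from Jupyter
# then open the http://localhost:5555/lab URL (with the token printed by jupyter lab) in a browser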

More documentation on using srun from the terminal is here.
