S3DF management discourages running heavy compute jobs on the interactive cluster (iana). A solution is to run Jupyter Lab as a batch job on the milano cluster.
Via Open OnDemand:
(no port forwarding needed)
- Log in to Open OnDemand at https://s3df.slac.stanford.edu/ondemand
- Click on the top menu bar and choose Interactive Apps → Jupyter
- Select Custom for Jupyter Image
- Select Conda Environment
- Set the path of the conda env that should run the Jupyter notebook (`conda info | grep "active env location"` prints the path of the currently active conda env)
(an example of setting up a conda env for Jupyter Lab for the first time is in the lume impact jupyter example here; change `<environment>` to the desired conda env name)
- Select "Use JupyterLab instead of Jupyter Notebook"
- For the run-on-cluster settings select: type: Batch; cluster: s3df; account: FACET; partition: milano
- Select number of hours and cores
- Configure the rest of the settings
- Launch and connect
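The conda env path requested above comes from the `conda info` output. A minimal sketch of extracting it programmatically (the function name and the sample path below are hypothetical; the real path depends on your conda installation):

```python
import re

def active_env_location(conda_info_output: str) -> str:
    """Return the value of the 'active env location' line from `conda info` output."""
    match = re.search(r"active env location\s*:\s*(\S+)", conda_info_output)
    return match.group(1) if match else ""

# Hypothetical excerpt of `conda info` output for illustration only.
sample = """
     active environment : jupyter-env
    active env location : /path/to/envs/jupyter-env
"""
print(active_env_location(sample))  # /path/to/envs/jupyter-env
```

In practice, `conda info | grep "active env location"` at the shell gives the same line directly.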
Via Custom Port Forwarding:
- Follow the lume impact example here to set up a conda env for the Jupyter notebook and install the packages you need (lume impact is just one example)
```shell
# From your local machine: ssh to iana, forwarding local port 5555
ssh -L 5555:localhost:5555 iana

# On iana: allocate an interactive session on the milano cluster
srun --partition milano --account FACET -n 100 --time=01:00:00 --pty /bin/bash

# In the srun session: activate the env and start Jupyter Lab
conda activate <CONDA ENV> && module load matlab && jupyter lab --no-browser --port=5555
```
- SSH into an interactive compute cluster like iana
- Use srun to allocate an interactive session on the milano cluster
- Load the appropriate conda env and start Jupyter
- If you can't connect to Jupyter Lab, make sure port forwarding for port 5555 is set up and working
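To check that last point without a browser, a small sketch that tests whether anything is listening on a local port (nothing here is S3DF-specific, and `port_open` is a hypothetical helper name; in practice you would call it with port 5555):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on host:port,
    e.g. the forwarded Jupyter Lab port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo against a throwaway local listener standing in for the ssh tunnel;
# for the real check use: port_open("localhost", 5555)
server = socket.socket()
server.bind(("localhost", 0))   # the OS picks a free port
server.listen(1)
_, demo_port = server.getsockname()
print(port_open("localhost", demo_port))  # True: listener is up
server.close()
```

If this returns False for port 5555, the tunnel is the problem, not Jupyter.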
More documentation on using srun from the terminal is here.