...
- SDF guide and documentation, particularly on using Jupyter notebooks interactively or through the web interface (which runs on top of nodes managed by SLURM)
- Training dataset dumper (used for producing h5 files from FTAG derivations) documentation and git repository (Prajita's fork; bjet_regression is the main branch)
- SALT documentation, SALT on SDF, puma git repo (used for plotting), and Umami docs (for postprocessing), also umami-preprocessing (UPP)
- SLAC GitLab group for plotting related code
Code Block |
---|
export PATH="/sdf/group/atlas/sw/conda/bin:$PATH" |
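As a quick illustration of why this export works: prepending a directory to PATH makes executables there shadow same-named ones found later in the search order, so the group conda takes precedence over any system conda. A minimal sketch (the `mytool` script and temporary directory are hypothetical, not part of the SDF setup):

```shell
# Hypothetical demo: a script placed in a directory prepended to PATH
# takes precedence over anything later in the search order.
demo_dir=$(mktemp -d)
printf '#!/bin/sh\necho from-demo\n' > "$demo_dir/mytool"
chmod +x "$demo_dir/mytool"
export PATH="$demo_dir:$PATH"
mytool   # prints "from-demo"
```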
FTAG1 derivation definition (FTAG1.py)
Documents and notes
- GN1 June 2022 PUB note, nice slides and proceedings from A. Duperrin
- Jannicke's thesis (chapter 4 on b-jets)
...
atlas
has 2 CPU nodes (2x 128 cores) and 1 GPU node (4x A100); 3 TB memory
usatlas
has 4 CPU nodes (4x 128 cores) and 1 GPU node (4x A100, 5x GTX 1080 Ti, 20x RTX 2080 Ti); 1.5 TB memory
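Since these nodes are managed by SLURM, GPU work is typically submitted as a batch job. A hedged job-script sketch: the partition name, GPU type string, resource numbers, environment name, and `train.py` are all assumptions to adapt (check actual partitions with `sinfo`):

```shell
#!/bin/bash
#SBATCH --partition=atlas          # assumed partition name; verify with `sinfo`
#SBATCH --gres=gpu:a100:1          # request one A100 on the GPU node
#SBATCH --cpus-per-task=8
#SBATCH --mem=64G
#SBATCH --time=04:00:00
#SBATCH --output=slurm-%j.out

# make the group conda available, activate an environment, and run the job
source /sdf/group/atlas/sw/conda/etc/profile.d/conda.sh
conda activate myenv               # hypothetical environment name
python train.py                    # hypothetical training script
```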
Environments
For sdf
An environment needs to be created to ensure all packages are available. We have explored some options for doing this.
Option 1: stealing the instance built for SSI 2023. This installs most of the useful packages but uses Python 3.6, which leads to issues with h5py.
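To avoid the Python 3.6 / h5py issue, a fresh environment with a newer Python can be created instead. A minimal sketch, where the environment name `ftag`, the Python version, and the package list are assumptions:

```shell
# create an environment with a recent Python so h5py installs cleanly
conda create -n ftag python=3.10 h5py numpy -c conda-forge
conda activate ftag

# quick sanity check that h5py imports
python -c "import h5py; print(h5py.version.version)"
```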
Starting Jupyter sessions via SDF web interface
...
Code Block |
---|
export PATH="/sdf/group/atlas/sw/conda/bin:$PATH"
export TMPDIR="${SCRATCH}" |
For S3DF
Setting up the conda environment has steps similar to those of SDF. To use the atlas group conda, use the following steps:
1.) Add the following to your .bashrc
Code Block |
---|
export PATH="/sdf/group/atlas/sw/conda/bin:$PATH" |
2.) Add the following to your .condarc
Code Block |
---|
pkgs_dirs:
- /fs/ddn/sdf/group/atlas/d/pbhattar/conda_envs/pkgs
envs_dirs:
- /fs/ddn/sdf/group/atlas/d/pbhattar/conda_envs/envs
channels:
- conda-forge
- defaults
- anaconda
- pytorch
auto_activate_base: true |
3.) Source conda.sh
Code Block |
---|
> source /sdf/group/atlas/sw/conda/etc/profile.d/conda.sh |
4.) Cross-check that conda is set up correctly
Code Block |
---|
> which conda # the following should appear if things went smoothly
conda ()
{
\local cmd="${1-__missing__}";
case "$cmd" in
activate | deactivate)
__conda_activate "$@"
;;
install | update | upgrade | remove | uninstall)
__conda_exe "$@" || \return;
__conda_reactivate
;;
*)
__conda_exe "$@"
;;
esac
} |
5.) Now follow the same steps as for SDF to create and activate the conda environment
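Putting steps 1–5 together, a fresh S3DF shell session could be set up roughly as follows; the environment name `ftag` and Python version are placeholders, and the PATH/.condarc edits are assumed to be in place as described above:

```shell
# 1-2: PATH and .condarc assumed configured as in the steps above
export PATH="/sdf/group/atlas/sw/conda/bin:$PATH"

# 3: make the `conda` shell function available in this session
source /sdf/group/atlas/sw/conda/etc/profile.d/conda.sh

# 4: sanity check -- `conda` should resolve to a shell function
type conda | head -n 1

# 5: create (once) and activate the working environment
conda create -n ftag python=3.10   # placeholder name and version
conda activate ftag
```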
...
Producing H5 samples with Training Dataset Dumper
...