
Presentations and useful meetings

SDF preliminaries

An environment needs to be created to ensure all packages are available. We have explored some options for doing this.

Option 1: stealing the instance built for SSI 2023. This provides most of the useful packages, but it uses Python 3.6, which leads to issues with h5py (a quick version check is shown after the steps below).
Starting Jupyter sessions via SDF web interface

  1. SDF web interface > My Interactive Sessions > Services > Jupyter (starts a server via SLURM)
  2. Jupyter Instance > slac-ml/SSAI
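
If you are not sure whether a given session is affected by the Python 3.6 / h5py issue, a quick check from a terminal (generic Python commands, nothing SDF-specific) is:

python --version
python -c "import h5py; print(h5py.__version__)"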

Option 2: create your own conda environment. Follow the SDF docs to use the ATLAS group installation of conda.
There is also a slight hiccup with permissions in the folder /sdf/group/atlas/sw/conda/pkgs, which you can sidestep by specifying your own folder for downloaded packages (in your GPFS data space).
The TLDR is:

export PATH="/sdf/group/atlas/sw/conda/bin:$PATH" 
conda init # the previous will be added to your bashrc file
# Add the following lines to ~/.condarc file (create default file with conda config)
pkgs_dirs:
  - /gpfs/slac/atlas/fs1/d/<user>/conda_envs

conda env create -f bjr_v01.yaml # for example(bjr_v01) conda install jupyter
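
For reference, an environment file of the kind used in the conda env create command above might look roughly like this. This is only a sketch: the actual contents of bjr_v01.yaml are not reproduced here, and the channel, package names, and versions below are illustrative.

cat > bjr_v01.yaml <<'EOF'
# Illustrative environment file -- not the actual bjr_v01.yaml
name: bjr_v01
channels:
  - conda-forge
dependencies:
  - python=3.9   # a version newer than 3.6, to avoid the h5py issue noted above
  - numpy
  - h5py
  - jupyter
EOF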

This env can be activated when starting a kernel in Jupyter by adding the following under Custom Conda Environment:

export CONDA_PREFIX=/sdf/group/atlas/sw/conda
export PATH=${CONDA_PREFIX}/bin/:$PATH
source ${CONDA_PREFIX}/etc/profile.d/conda.sh
conda env list
conda activate bjr_v01
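
Alternatively, if ipykernel is installed in the environment, it may be possible to register it as a named Jupyter kernel instead. This is a general Jupyter mechanism and has not been verified on the SDF Jupyter service; the display name is arbitrary.

conda activate bjr_v01
python -m ipykernel install --user --name bjr_v01 --display-name "bjr_v01 (conda)"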


Producing H5 samples

We are using a custom fork of dataset-dumper, developed for producing h5 files for NN training based on FTAG derivations. The custom fork is modified to store the truth jet pT via the AntiKt4TruthDressedWZJets container.

  • TODO: Add documentation about the training dataset dumper fork

The currently available ntuples are located at:

/gpfs/slac/atlas/fs1/d/pbhattar/BjetRegression/Input_Ftag_Ntuples
├── Rel22_ttbar_AllHadronic
├── Rel22_ttbar_DiLep
└── Rel22_ttbar_SingleLep

Analyzing H5 samples

Notebooks

  • Chunking h5 files: /sdf/home/b/bbullard/bjes/analysis/ChunkH5.ipynb
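
To get a quick look at the structure of an h5 file before chunking, the standard HDF5 command-line tools can be used if they are available in your environment (the file path below is a placeholder):

h5ls -r <file>.h5            # recursively list all groups and datasets with their shapes
h5dump -H <file>.h5 | less   # print only the header (names, shapes, dtypes), not the data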

Miscellaneous tips

You can grant read access to your GPFS data folder directories for ATLAS group members via the following (note that this does not work for the SDF home folder):

groups <username> # To check user groups
cd <your_directory> 
find . -type d|xargs chmod g+rx # Need to make all subdirectories readable *and* executable to the group
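
Note that the command above only fixes directories; the files inside them also need group read permission before other group members can open them. Assuming you want the whole tree group-readable, a possible addition is:

find . -type f | xargs chmod g+r # make all files group-readable as well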