This page describes smalldata_tools, a suite of code for reducing xtc data to small(er) hdf5 files at several stages of the analysis. The code can be found on GitHub at https://github.com/slac-lcls/smalldata_tools.
At XPP or XCS, the code setup is usually taken care of by the beam line staff. For other hutches, please contact the controls POC or pcds-poc-l. The suggested working directories are:
/cds/data/psdm/<hutch>/<expname>/results/smalldata_tools
for the offline system and
/cds/data/drpsrcf/<hutch>/<expname>/scratch/smalldata_tools
for the (new) fast feedback system (psffb).
Online and offline analysis
Two analysis infrastructures, each comprising various queues and interactive nodes, are available depending on the status of the experiment.
Online analysis
Ongoing experiments generally use the online analysis infrastructure, the fast feedback system (FFB). More information on the system can be found here: Fast Feedback System
This system is faster and prioritizes ongoing experiments. Some time after the experiment is over, access to the data on this system is locked and only the offline system remains available.
Offline analysis
After the experiment is over, the data and smalldata production code are moved to the offline system, the anafs. This system is available for analysis indefinitely and can be used to reprocess or refine the data.
How do I access the computing resources?
ssh -X <ACCOUNT>@pslogin.slac.stanford.edu
If using NoMachine, login to
psnxserv.slac.stanford.edu
Then:
ssh -X psana
source /reg/g/psdm/etc/psconda.sh  # Environment to use psana, etc.
or for the online analysis:
ssh -X psffb
source /reg/g/psdm/etc/psconda.sh  # Environment to use psana, etc.
Workflow
The analysis is generally split into two steps, allowing for easy diagnostics and customization of the analysis process. Please contact your controls and data POC to assess the best approach for your experiment.
- The first step is the generation of the "small data" file, the colloquial name for run-based hdf5 files which contain different data arrays whose first dimension is the number of events (a minimal reading sketch is given below the workflow list). This production can be run automatically on each new run so that the data is available only a few minutes after the run has ended. It can also be run on request in case you want to tweak the data extraction. Pre-processing of the area detectors can be configured at this stage, performing operations such as extracting a region of interest, azimuthal integration, photon counting, etc. The following pages describe this in more detail.
- The second stage depends much more on the type of experiment. Different options are available:
- Adapt one of the templated analysis notebooks to suit the current experiment's needs. These custom templates have been made for the more common experiments performed at the different endstations at LCLS and are available at /reg/g/psdm/sw/tools/smalldata_tools/example_notebooks. This approach is generally recommended for lightweight data analysis, for which the area detector images are reduced to a single number (or a few numbers), for example by integration of an ROI or azimuthal binning. Please refrain from modifying these released notebooks in place. A description of the example notebooks can be found here: Example notebooks
- Binning of the full detector images can be performed by setting up the cube analysis, which returns a dataset of binned data and images, resulting in a relatively lightweight file. This approach is recommended in cases where analysis of the full image is needed (Q-resolved diffuse scattering analysis, for example). It is also the approach recommended for users with little Python experience.
Details on the cube workflow are given here: Cube production
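Assuming the cube output is also saved as hdf5 (like the smallData files), a minimal sketch of inspecting such a file with h5py follows; the file path and the dataset names (binVar_bins, jungfrau1M_data) are hypothetical placeholders that will differ per experiment and cube configuration:

import h5py

# Hypothetical path to a cube file for run 42; the actual location and
# naming convention depend on the experiment and the cube configuration.
cube_file = '/cds/data/psdm/xpp/xppx12345/hdf5/smalldata/Cube_xppx12345_Run0042.h5'

with h5py.File(cube_file, 'r') as f:
    # List every dataset and its shape to see how the data was binned.
    f.visititems(lambda name, obj: print(name, obj.shape)
                 if isinstance(obj, h5py.Dataset) else None)
    # Hypothetical datasets: bin centers of the scanned variable and the
    # per-bin averaged detector images with shape (nbins, ny, nx).
    bins = f['binVar_bins'][:]
    images = f['jungfrau1M_data'][:]

print('number of bins:', bins.shape[0])
print('binned image stack:', images.shape)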
The contents of the smallData files are described here.
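As a minimal, hedged illustration of that layout, the sketch below opens a smalldata file with h5py and bins a pre-processed ROI sum against a scanned delay; the file path and the dataset names (ipm2/sum, scan/lxt, jungfrau1M/ROI_0_sum) are hypothetical and will differ between experiments:

import h5py
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical smalldata file for run 42; adjust hutch, experiment and run.
fname = '/cds/data/psdm/xpp/xppx12345/hdf5/smalldata/xppx12345_Run0042.h5'

with h5py.File(fname, 'r') as f:
    i0 = f['ipm2/sum'][:]               # hypothetical: incoming intensity per event
    delay = f['scan/lxt'][:]            # hypothetical: scanned delay per event
    roi = f['jungfrau1M/ROI_0_sum'][:]  # hypothetical: pre-processed ROI sum

# All arrays share the same first dimension: the number of events in the run.
print(i0.shape, delay.shape, roi.shape)

# Keep events with a reasonable incoming intensity, then normalize the signal.
good = i0 > 0.1 * np.nanmedian(i0)
sig = roi[good] / i0[good]

# Bin the normalized signal by delay and plot the per-bin average.
edges = np.linspace(delay[good].min(), delay[good].max(), 51)
centers = 0.5 * (edges[1:] + edges[:-1])
counts, _ = np.histogram(delay[good], bins=edges)
sums, _ = np.histogram(delay[good], bins=edges, weights=sig)
plt.plot(centers, sums / np.maximum(counts, 1), 'o-')
plt.xlabel('delay')
plt.ylabel('normalized ROI sum')
plt.show()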
Access analysis code and folders from JupyterLab
From JupyterHub, click on the "+" symbol on the top left. Select "Terminal" and make a soft-link to the experiment folder:
ln -s /cds/data/psdm/<hutch>/<experiment>/ ./<link>
If the experiment is going to make use of the FFB, make a second soft-link (use a different link name):
ln -s /cds/data/drpsrcf/<hutch>/<experiment>/ ./<link>