This page describes smalldata_tools, a suite of code for reducing the xtc data to small(er) hdf5 files at several stages of the analysis. The code can be found on GitHub at https://github.com/slac-lcls/smalldata_tools.

At XPP or XCS, the code setup is usually taken care of by the beamline staff. For other hutches, please contact the controls POC or pcds-poc-l. The suggested working directories are:

/reg/d/psdm/<hutch>/<expname>/results/smalldata_tools

for the offline and

/cds/data/drpsrcf/<hutch>/<expname>/scratch/smalldata_tools

for the (new) fast feedback system (psffb). Here you will find three main scripts and three Python files that can be adapted to the needs of your experiment.

Workflow

The analysis is generally split in two steps, allowing for easy diagnostics and customization of the analysis process. Please contact your controls and data POC to assess the best approach for your experiment.

  • The first step is the generation of the "small data" file, the colloquial name for run-based hdf5 files containing data arrays whose first dimension is the number of events. This production can be run automatically on each new run, so that the data is available only a few minutes after the run has ended.

  • The second stage depends much more on the type of experiment. Different options are available:
    • Adapt one of the templated analysis notebooks to suit the needs of the current experiment. These templates have been made for the more common experiments performed at the different endstations at LCLS and are available at <path>. This approach is generally recommended for lightweight data analysis, where the area detector images are reduced to a single number or a few numbers (integration of an ROI or azimuthal binning, for example).
      A description of the example notebooks can be found here: <link>
    • Binning of the full detector images can be performed by setting up the cube analysis, which returns a dataset of binned data and images, resulting in a relatively lightweight file. This approach is recommended when analysis of the full image is needed (Q-resolved diffuse scattering analysis, for example).
      Details on the cube workflow are given here: <link>
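The first step above reduces each area-detector image to a small set of per-event quantities. As a minimal illustrative sketch of that idea (not the actual smalldata_tools API; the array names, shapes, and ROI are hypothetical), an ROI sum turns a stack of images into one number per event, which is the kind of array stored in the small data file with the event index as the first dimension:

```python
import numpy as np

# Hypothetical stack of area-detector images, one per event: (event, row, col)
rng = np.random.default_rng(0)
n_events = 5
images = rng.random((n_events, 128, 128))

# Region of interest defined as a (row, col) slice on the detector
roi = (slice(30, 60), slice(40, 80))

# One scalar per event: sum the ROI pixels for every event at once.
# The resulting 1-D array has the event number as its only dimension.
roi_sum = images[(slice(None),) + roi].sum(axis=(1, 2))
print(roi_sum.shape)  # (5,)
```

Arrays of this shape, one entry per event, are what make the resulting hdf5 files small enough to load and explore interactively.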


The contents of the smallData files are described here:

smallData Contents

smalldata_tools also contains code to help with the analysis of these files and with a streamlined production of "binned" data (the "cube"). It can be run either in an IPython environment or in Jupyter notebooks. "Start" notebooks for the analysis will be provided and can be adjusted in advance to the needs of the upcoming experiment, to lighten the load on the user.
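The "cube" idea can be sketched as binning per-event quantities by a scan variable. The following is an illustrative numpy-only sketch, not the smalldata_tools cube code itself; the variable names (a delay-stage position and a reduced detector signal) are hypothetical:

```python
import numpy as np

# Hypothetical per-event data: a scan variable and a reduced detector signal
rng = np.random.default_rng(1)
n_events = 1000
delay = rng.uniform(-1.0, 1.0, n_events)   # e.g. delay stage position per event
signal = rng.random(n_events)              # e.g. ROI sum per event

# Define 10 bins over the scan range and assign each event to a bin
edges = np.linspace(-1.0, 1.0, 11)
idx = np.clip(np.digitize(delay, edges) - 1, 0, len(edges) - 2)

# Per-bin sums and event counts; their ratio is the binned (averaged)
# signal that a cube file would store, alongside the bin edges.
sums = np.bincount(idx, weights=signal, minlength=len(edges) - 1)
counts = np.bincount(idx, minlength=len(edges) - 1)
binned = sums / np.maximum(counts, 1)
print(binned.shape)  # (10,)
```

The real cube production applies the same grouping to full detector images as well, which is why the resulting file stays manageable even when image-level analysis is needed.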