
  • The first step is the generation of the "small data" file, the colloquial name for the run-based hdf5 files that contain different data arrays whose first dimension is the number of events, so shot-to-shot information is retained. This production can be run automatically on each new run, so that the data is available only a few minutes after the run has ended, or on request if you want to tweak the data extraction. Pre-processing of the area detectors can be configured at this stage, performing operations such as extracting a region of interest, azimuthal integration, photon counting, etc. Saving the full data of large area detectors at this step is not recommended. A minimal sketch of how such a file can be read is shown after this list.
    The following pages describe this in more detail:

    Generation of small hdf5 files

    Configuration of SmallData

    Adding Data from area detectors

  • The second stage depends much more on the type of experiment. Different options are available:
    • Binning of the full detector images can be performed by setting up the cube analysis, which produces an hdf5 file of binned data and images, i.e. a relatively lightweight file. While the shot-to-shot information is lost at this point, this approach is generally recommended: it requires less effort and does not require delving into the details of the binning procedure. It is also almost mandatory when the analysis of the full images is needed (Q-resolved diffuse scattering analysis, for example). Note that the shot-to-shot information remains readily available from the file produced in the first step (without the area detector data).
      Details on the cube workflow are given here: Cube production 
    • Adapt one of the templated analysis notebooks to suit the needs of the current experiment. These templates have been made for the more common experiments performed at the different LCLS endstations and are available at /reg/g/psdm/sw/tools/smalldata_tools/example_notebooks (please refrain from modifying these released notebooks in place). This approach works well for lightweight data analysis, where the area detector images are reduced to a single number or a few numbers in the first step (integration of a ROI or azimuthal binning, for example). It is also suited to cases where detailed shot-to-shot information needs to be examined and the users want full control over the data binning process; a small binning sketch is shown after this list.
      Documentation on the example notebooks can be found here: Example notebooks.
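
As a rough illustration of the first step's output, the sketch below reads a few event-indexed arrays from a small data file with h5py. The file path and dataset names are placeholders and will differ between experiments; the actual layout depends on how the smalldata production was configured.

    import h5py

    # Hypothetical file path and dataset names; the real layout depends on
    # the smalldata configuration of the experiment.
    with h5py.File('/path/to/smalldata_run0123.h5', 'r') as f:
        delays = f['scan/lxt'][:]         # scan variable, one value per event
        ipm = f['ipm2/sum'][:]            # intensity monitor, one value per event
        roi_sum = f['det/ROI_0_sum'][:]   # reduced area-detector quantity

    # All arrays share the event axis, so an event mask applies to all of them.
    good = ipm > 0.1
    print(delays[good].shape, roi_sum[good].shape)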

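For the notebook-style analysis described in the second option, the binning can be done directly on the event-indexed arrays. The sketch below bins a ROI-summed signal by a scan variable and normalizes it by an intensity monitor; the arrays here are synthetic placeholders standing in for data read from the small data file as above.

    import numpy as np

    # Illustrative event-length arrays; in practice they would come from the
    # small data file as in the reading sketch above.
    rng = np.random.default_rng(0)
    delays = rng.uniform(-1.0, 2.0, 10000)       # scan variable, one value per shot
    ipm = rng.normal(1.0, 0.1, 10000)            # normalization monitor
    roi_sum = ipm * (1.0 + 0.3 * (delays > 0))   # ROI-summed detector signal

    # Histogram-based binning: sum signal and monitor per bin, then normalize.
    bins = np.linspace(-1.0, 2.0, 31)
    norm, _ = np.histogram(delays, bins=bins, weights=ipm)
    signal, _ = np.histogram(delays, bins=bins, weights=roi_sum)
    binned = signal / np.maximum(norm, 1e-12)
    centers = 0.5 * (bins[:-1] + bins[1:])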