This section details how to set up custom binning ("cubing") of the full detector images and save the output to an HDF5 file.

The main cube script is ./producers/letsCube.py, but this file should generally not be modified. The configuration is done through a hutch-specific file of the form ./producers/cube_config_<hutch>.py.

Config file

The configuration consists of the following main sections, all of which appear in the example below: bin boundaries for the main axis (binBoundaries), filters to apply to the data (filters), laser on/off sorting (laser), the list of variables and detectors to cube (varList), the histogram configuration (hist_list), and tiff output (save_tiff, MEC only).

Example config file for XPP:

import numpy as np

# custom bins
def binBoundaries(run):
    if isinstance(run,str):
        run=int(run)
    if run>0:
        return np.arange(-5.,50.,0.2)
    return None


# filters to apply to the data
# format: list of [det (field), low, high, name]
# 'filter1' is the standard name and will not be added to the h5 file name.
filters = [
    ['lightStatus/xray',0.5,1.5,'filter1'],
    ['ipm2/sum',3e2,6e4,'filter1'],
    ['evr/code_41',0.5,1.5,'custom']
]

# Laser on/off.
laser = True

# List of detectors to be cubed. Area detectors have additional options such as a threshold.
# For now only full image or calib works. TODO: add photon maps, and then any detObjectFunc.
# The detector dictionary should then be added to varList.
detDict = {'source':'jungfrau1M',
           'full':1,
           'image':0,
           'thresADU':6.5,
           'common_mode':0}

varList = ['ipm2/sum','ipm3/sum','diodeU/channels', detDict]


# histogram configuration. Usually does not need to be changed
# field: destination in smd, list: [low,high,n] or None (then default to some percentile)
hist_list = {
    'ipm2/sum': [0, 5e4, 70],
    'ipm3/sum': [0, 5e3, 70],
    'tt/FLTPOS_PS': [-0.5, 0.5, 70],
    'tt/AMPL': [0, 0.2, 70],
    'tt/FLTPOSFWHM': [0, 300, 70]
}


# save as tiff file (ignore unless MEC)
save_tiff = False
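
The bin boundaries can depend on the run number in any way. As a hypothetical variation on the binBoundaries function above (the run numbers, ranges, and steps below are made up), later runs could for instance be binned over a wider range with a coarser step:

import numpy as np

def binBoundaries(run):
    if isinstance(run, str):
        run = int(run)
    # Hypothetical: later runs scanned a wider range and use coarser bins.
    if run >= 100:
        return np.arange(-10., 100., 0.5)
    if run > 0:
        return np.arange(-5., 50., 0.2)
    return None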

Cube file content

The cubed data are saved in an h5 file.

An example notebook that loads and plots a cube file can be found at: /cds/group/psdm/sw/tools/smalldata_tools/example_notebooks/cube.ipynb

Please don't modify this file; copy it to your home directory to test and explore.
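
For a quick look outside the notebook, a cube file can also be opened directly with h5py. The sketch below is only illustrative: the file path and the dataset names ('bins', 'ipm2/sum', 'nEntries') are assumptions and should be replaced with the fields actually present in your cube file (list them with f.visit(print)).

import h5py
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical cube file path; adjust to your experiment and run.
cube_file = '/path/to/hutch/experiment/hdf5/smalldata/cube_file.h5'

with h5py.File(cube_file, 'r') as f:
    f.visit(print)                # list all available fields
    # The names below are assumptions, not guaranteed field names.
    bins = f['bins'][()]          # assumed: one value per bin along the main axis
    ipm2 = f['ipm2/sum'][()]      # binned ipm2 sum
    nEntries = f['nEntries'][()]  # assumed: number of events per bin

# Normalize the binned signal by the number of events per bin before plotting.
plt.plot(bins, ipm2 / np.maximum(nEntries, 1))
plt.xlabel('bin axis')
plt.ylabel('ipm2/sum per event')
plt.show()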

Debug the cube production

The cube job can sometimes fail or get stuck. Unfortunately, the code is still in a state where a failure does not always cause it to exit; the job may instead simply hang.

Here we go over specific cases and how to identify them by looking at the log files. In general, the log file is the first place to check when a job misbehaves.

The job seems stuck

  1. Check that the filters do not filter out all the events. In the current state, if all shots are filtered out, the job will just hang forever. This can easily be spotted by looking for the following section in the log file (a quick way to check the filter ranges against the smalldata file is sketched after the log excerpt):

    did not select any event, quit now!
    getFilter: Cut 0.500000 < evr/code_94 < 1.500000 passes 0 events of 11252, total passes up to now: 0 
    getFilter: Cut 24.464119 < scan/diag_x < 24.496000 passes 11252 events of 11252, total passes up to now: 0
    getFilter: Cut 0.500000 < damage/jungfrau1M < 1.500000 passes 11252 events of 11252, total passes up to now: 0
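
If the cuts look suspicious, the filter ranges can be checked against the smalldata h5 file before resubmitting the cube job. This is a minimal stand-alone sketch, not part of letsCube.py; the smalldata file path is hypothetical and the fields are taken from the filter example above.

import h5py

# Hypothetical smalldata file; replace with the actual path for your experiment and run.
smd_file = '/path/to/hutch/experiment/hdf5/smalldata/expname_Run0001.h5'

# Same format as in the config file: [field, low, high, name]
filters = [
    ['lightStatus/xray', 0.5, 1.5, 'filter1'],
    ['ipm2/sum', 3e2, 6e4, 'filter1'],
    ['evr/code_41', 0.5, 1.5, 'custom'],
]

with h5py.File(smd_file, 'r') as f:
    combined = None
    for field, low, high, name in filters:
        data = f[field][()]
        passed = (data > low) & (data < high)
        combined = passed if combined is None else (combined & passed)
        print(f'{field}: {passed.sum()} of {data.size} events pass [{low}, {high}]')
    print(f'All filters combined: {combined.sum()} events pass')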


Notes on the multi-dimensional cube

This feature is still in development but is already fully functional; clean-up and quality-of-life improvements are to be expected.

One or more additional bin axes can be selected, following the same run-dependent logic as the main bin axis.

Example:

def get_addBinVars(run):
    if isinstance(run,str):
        run=int(run)
    addBinVars = None
    if run==128:
        addBinVars = {'ipm2/sum': np.linspace(0,4e4,4)}
    return addBinVars

The additional axes are passed as a dictionary of the form {'<variable>': <bin_array>}.
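
For instance, a hypothetical configuration requesting two additional axes at once could look like the following (the second field name, 'scan/lxt', is purely illustrative):

import numpy as np

# Hypothetical: additional binning in ipm2/sum and in a delay variable.
addBinVars = {
    'ipm2/sum': np.linspace(0, 4e4, 4),
    'scan/lxt': np.linspace(-1e-12, 1e-12, 11),
}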

Cubed file content

The bin variables are handled slightly differently than in the regular cube. In the future, more uniformity can be expected.

The cubed variables now have a shape matching that of the bin axes, i.e. (size(bin_axis1), size(bin_axis2), ...).

The bin axes are saved as 1-dimensional arrays named bin_<axis-name>.
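
As a quick sanity check, the shapes of a multi-dimensional cube can be inspected directly. In the sketch below, 'bin_ipm2/sum' follows the bin_<axis-name> convention described above, while the other dataset names ('bins', 'jungfrau1M/data') are assumptions and may differ in your file.

import h5py

# Hypothetical multi-dimensional cube file.
cube_file = '/path/to/hutch/experiment/hdf5/smalldata/cube_file.h5'

with h5py.File(cube_file, 'r') as f:
    print('main axis:', f['bins'].shape)            # assumed name for the main bin axis
    print('extra axis:', f['bin_ipm2/sum'].shape)   # 1-dimensional, as described above
    # A cubed variable should have shape (n_bins_axis1, n_bins_axis2, ...).
    print('cubed detector:', f['jungfrau1M/data'].shape)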