This section describes how to set up custom binning of the full detector images and save the output to an HDF5 file.

The main cube script is ./producers/letsCube.py, but this file should generally not be modified. The configuration is done through a hutch-specific file of the form ./producers/cube_config_<hutch>.py.

Config file

The configuration consists of the following main sections, all illustrated in the example below: the bin boundaries (binBoundaries), the event filters (filters), the laser on/off handling (laser), the list of variables and detectors to cube (varList), the histogram configuration (hist_list), and the tiff output option (save_tiff).

Example config file for XPP:

import numpy as np

# custom bins
def binBoundaries(run):
    if isinstance(run, str):
        run = int(run)
    if run > 0:
        return np.arange(-5., 50., 0.2)
    return None
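# Illustration (not part of the original example): the boundaries do not have to
# be uniform; any 1D array of increasing bin edges can be returned, e.g. with
# finer bins around zero:
# def binBoundaries(run):
#     return np.concatenate([np.arange(-5., 0., 0.5),
#                            np.arange(0., 5., 0.05),
#                            np.arange(5., 50., 0.5)])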


# filters to apply to the data
# format: list of [det (field), low, high, name]
# 'filter1' is the standard name and will not be added to the h5 file name.
filters = [
    ['lightStatus/xray',0.5,1.5,'filter1'],
    ['ipm2/sum',3e2,6e4,'filter1'],
    ['evr/code_41',0.5,1.5,'custom']
]

# Laser on/off.
laser = True

# List of detectors to be cubed. Area detectors have additional options such as a threshold.
# For now only the full image or calib data works. TODO: add photon maps, and then any detObjectFunc.
# Detectors should then be added to varList.
detDict = {'source':'jungfrau1M',
           'full':1,
           'image':0,
           'thresADU':6.5,
           'common_mode':0}

varList = ['ipm2/sum','ipm3/sum','diodeU/channels', detDict]
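# Illustration (assumption, not from the original example): additional area
# detectors can presumably be cubed by appending more dictionaries of the same
# form to varList, e.g.:
# detDict2 = {'source': 'epix_1',   # hypothetical detector name
#             'full': 1,
#             'image': 0,
#             'thresADU': 3.0,
#             'common_mode': 0}
# varList = ['ipm2/sum', 'ipm3/sum', 'diodeU/channels', detDict, detDict2]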


# Histogram configuration. Usually does not need to be changed.
# key: field destination in the smalldata (smd) file, value: [low, high, n] or None
# (None defaults to a percentile-based range)
hist_list = {
    'ipm2/sum': [0, 5e4, 70],
    'ipm3/sum': [0, 5e3, 70],
    'tt/FLTPOS_PS': [-0.5, 0.5, 70],
    'tt/AMPL': [0, 0.2, 70],
    'tt/FLTPOSFWHM': [0, 300, 70]
}


# save as tiff file (ignore unless MEC)
save_tiff = False
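
A quick way to catch trivial mistakes before submitting a cube job (the most common failure mode, see the debugging section below) is to import the config in an interactive python session and inspect its contents. A minimal sketch, assuming the config file above is saved as cube_config_xpp.py and that the producers directory is on the python path:

    import importlib

    # Hypothetical module name; use the config file for your hutch.
    cfg = importlib.import_module('cube_config_xpp')

    # Check that the bin boundaries are sensible for a given run.
    edges = cfg.binBoundaries(42)
    print(f'{edges.size - 1} bins from {edges[0]} to {edges[-1]}')

    # Check the filter definitions: [field, low, high, name]
    for field, low, high, name in cfg.filters:
        assert low < high, f'bad filter range for {field}'
        print(f'{name}: {low} < {field} < {high}')

    print('variables to cube:', cfg.varList)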

Cube file content

The cubed data are saved in an h5 file with the following naming convention:

Cube_<exp>_<run#>_<bin_axis>_<laser_status>_<filter_name>.h5

When laser=False is used, or when filter1 is used as the filter name, the corresponding parts are omitted from the file name.

The datasets stored in the file depend on the configuration, in particular on the entries in varList.
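
Since the layout therefore varies from cube to cube, the simplest way to see what a given file contains is to open it with h5py and walk it. A minimal sketch, with a hypothetical file name:

    import h5py

    # Hypothetical file name following the convention above.
    fname = 'Cube_xppx1003221_Run0042_delay.h5'

    with h5py.File(fname, 'r') as f:
        # Print every dataset with its shape and dtype.
        f.visititems(lambda name, obj: print(name, obj.shape, obj.dtype)
                     if isinstance(obj, h5py.Dataset) else None)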

An example notebook that loads and plots a cube file can be found at: /cds/group/psdm/sw/tools/smalldata_tools/example_notebooks/cube.ipynb

Please don't modify this file directly; copy it to your home or experiment area to test and explore.

Debug the cube production


Job is done but the report does not show "Cube: Done"

This means the job failed somewhere. Open the log files and check whether one of the following cases applies:

  1. The most common case is a typo or syntax error in the config file. In this case, the log file should be very short and point directly at the part of the config file that has the issue. As is often the case with typos or syntax errors in python, the error is not always at the exact line mentioned in the error message.
  2. Check that the filters do not filter out all the pulses. This can easily be spotted by looking for the following section at the end of the log file (a sketch for checking the cuts by hand follows the log excerpt):

    did not select any event, quit now!
    getFilter: Cut 0.500000 < evr/code_94 < 1.500000 passes 0 events of 11252, total passes up to now: 0 
    getFilter: Cut 24.464119 < scan/diag_x < 24.496000 passes 11252 events of 11252, total passes up to now: 0
    getFilter: Cut 0.500000 < damage/jungfrau1M < 1.500000 passes 11252 events of 11252, total passes up to now: 0
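
The pass counts printed for each cut (as above) usually show which filter is rejecting everything. If you want to check the cuts against the data yourself, the same fields can be read from the run's smalldata h5 file and the cuts applied by hand. A minimal sketch, assuming a smalldata file containing the fields used in the filters (the file name here is hypothetical):

    import h5py

    # Hypothetical smalldata file name; use the one for your run.
    with h5py.File('smalldata_run0042.h5', 'r') as f:
        xray = f['lightStatus/xray'][:]
        ipm2 = f['ipm2/sum'][:]

    # Reproduce the cuts from the 'filters' list in the config.
    cut_xray = (xray > 0.5) & (xray < 1.5)
    cut_ipm2 = (ipm2 > 3e2) & (ipm2 < 6e4)

    print(f'xray cut passes {cut_xray.sum()} / {xray.size} events')
    print(f'ipm2 cut passes {cut_ipm2.sum()} / {ipm2.size} events')
    print(f'combined: {(cut_xray & cut_ipm2).sum()} events')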

Analyze cube results

An example notebook of a cube file analysis can be found in /cds/group/psdm/sw/tools/smalldata_tools/example_notebooks/cube.ipynb.
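
For a quick look outside the notebook, a cubed scalar can also be plotted directly against the bin axis. A minimal sketch; the dataset names below are hypothetical, so check the actual names with the inspection snippet in the "Cube file content" section:

    import h5py
    import matplotlib.pyplot as plt

    # Hypothetical file and dataset names; adjust to your cube file.
    with h5py.File('Cube_xppx1003221_Run0042_delay.h5', 'r') as f:
        bins = f['bins'][:]        # bin centers or edges along the cube axis
        signal = f['ipm2/sum'][:]  # one of the cubed variables

    plt.plot(bins[:len(signal)], signal, 'o-')
    plt.xlabel('cube bin axis')
    plt.ylabel('ipm2/sum (binned)')
    plt.show()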