Mar. 24, 2017:  "Reasons For Re-Processing Data"

Link to slides:

drp-users-mar-24-2017.pptx

...

* Ti-Yen
* halo close to the beam center makes hit finding difficult for SPI (see the sketch below)
* converting ADU to photons is sufficient for SPI
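
A minimal sketch of a hit finder along these lines, assuming a per-shot photon-count image: ignore a disk around the beam center where the halo dominates, and threshold on the number of lit pixels elsewhere. The function name, geometry, and all parameter values are illustrative assumptions, not something stated in the talk.

```python
import numpy as np

def is_hit(photons, beam_center, halo_radius_px=50, lit_threshold=100):
    """Toy SPI hit finder. Pixels within halo_radius_px of the beam
    center are ignored because halo scattering dominates there; the
    shot counts as a hit if enough remaining pixels saw a photon.
    All thresholds are illustrative, not recommendations."""
    yy, xx = np.indices(photons.shape)
    r = np.hypot(yy - beam_center[0], xx - beam_center[1])
    lit = np.count_nonzero(photons[r > halo_radius_px] > 0)
    return lit >= lit_threshold
```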

* Aaron Brewster
* reprocessing because of an unknown crystal unit cell
* the unit cell can drift during the experiment, depending on sample preparation

* Peter Zwart
* the hit rate for imaging can be quite high, up to 80%
* clustering algorithm for hit finding
* stable beam and sample delivery required
* difficulties in converting ADU to photons for the pnCCD; rounding errors
* check photon conversion on simulated data to understand errors in the conversion (see the sketch below)
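
To make the rounding issue concrete, here is a hedged sketch of an ADU-to-photon conversion checked against simulated data, along the lines suggested above. The gain, noise level, and mean occupancy are made-up numbers; real calibrations are per-pixel.

```python
import numpy as np

rng = np.random.default_rng(0)

def adu_to_photons(adu, gain):
    """Scale calibrated ADU by the single-photon gain and round to the
    nearest whole photon. The rounding step is where conversion errors
    creep in when gain or noise is misestimated."""
    return np.clip(np.rint(adu / gain), 0, None).astype(int)

# Check the conversion on simulated data, as suggested above: draw
# Poisson photon counts, add Gaussian readout noise in ADU, convert
# back, and measure how often a pixel is assigned the wrong count.
gain = 30.0                                       # ADU per photon (assumed)
true_photons = rng.poisson(0.1, size=1_000_000)   # sparse SPI-like occupancy
adu = true_photons * gain + rng.normal(0.0, 8.0, size=true_photons.shape)
recovered = adu_to_photons(adu, gain)
print("misassigned pixel fraction:", np.mean(recovered != true_photons))
```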

 


Extra detail from Anton Barty:

 


The question was: when do you go back over all the data?

...

- All analysis must be monitored and reprogrammed in real time. LCLS will have to understand a lot more about each experiment to be able to provide the necessary support in real time, at all hours. Recording everything and figuring it out later is no longer possible.

Thoughts from Tim van Driel:
Full data analysis still benefits some experiments, as the higher dimensionality of the data makes it easier to extract non-linear behavior and correlated fluctuations.
If we are careful with the measurement and the diagnostic tools perform adequately, we can instead rely on littledata and cube. To fully rely on them, we would require both for future experiments, since each is sensitive to different types of errors.
When going from full data analysis to cube/littledata, largely the same corrections are needed, but which ones are essential differs between littledata and cube processing.
If new detectors behave less ideally than the CSPAD does now, we are back to needing the full datasets to develop the necessary filtering and corrections.

A quick note regarding radial integration: all pump-probe diffuse scattering experiments have anisotropy at early times (usually <10 ps, but it can persist up to ns). It may be negligible if the solute signal is relatively large, as in protein crystallography. The anisotropy can be separated using Legendre polynomials of different order, but it is probably easiest to handle on the fly by integrating the data along both phi and theta. I would use at least 17 phi bins, which turns the assumed 1e3 reduction factor for diffuse scattering into roughly 1e2 (see the sketch below).
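
A hedged sketch of that on-the-fly (theta, phi) integration: instead of a 1D radial profile, accumulate the mean intensity on a 2D (radius, azimuth) grid so the anisotropic component can be separated later. Bin counts, geometry, and names are assumptions; n_phi defaults to the 17 bins suggested above.

```python
import numpy as np

def integrate_q_phi(img, beam_center, n_r=200, n_phi=17, mask=None):
    """Mean intensity on a (radius, azimuth) grid rather than a plain
    radial average, so early-time anisotropy is preserved. Pixel-space
    radius stands in for q here; a real version would convert using
    detector distance and wavelength."""
    yy, xx = np.indices(img.shape)
    dy, dx = yy - beam_center[0], xx - beam_center[1]
    r = np.hypot(dy, dx).ravel()
    phi = np.arctan2(dy, dx).ravel()              # azimuth in [-pi, pi]
    w = img.astype(float).ravel()
    keep = np.ones(w.size, bool) if mask is None else mask.ravel()
    bins = (n_r, n_phi)
    tot, redges, pedges = np.histogram2d(r[keep], phi[keep],
                                         bins=bins, weights=w[keep])
    cnt, _, _ = np.histogram2d(r[keep], phi[keep], bins=bins)
    with np.errstate(invalid="ignore", divide="ignore"):
        return tot / cnt, redges, pedges          # NaN where a bin is empty
```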

Reasons for reprocessing the data:
Full data analysis (used on experiments before 2016)
 - Detector calibration
 - Detector geometry
 - Common mode subtraction (see the sketch after this list)
 - Sample-detector distance
 - Correlated behavior, outliers, non-linear corrections
 - Time-tool calibration
 - Masking
 - Binning
 - Experimental detector corrections (solid angle coverage, polarization, jet geometry, sample composition)
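
Common mode subtraction shows up in all three pipelines here. As a rough illustration of the idea (not the actual per-detector implementation), this is a per-row median version that estimates the baseline from pixels below a signal threshold so real photon hits do not bias it:

```python
import numpy as np

def common_mode_rows(frame, signal_thresh=20.0):
    """Illustrative common-mode correction: for each row, estimate the
    baseline from pixels below signal_thresh (so photon hits do not
    bias the estimate) and subtract it. Real detectors apply this per
    ASIC/bank with detector-specific thresholds."""
    corrected = frame.astype(float)
    for row in corrected:
        dark = row[row < signal_thresh]
        if dark.size:
            row -= np.median(dark)
    return corrected
```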

Littledata (used from 2016)
 - Detector calibration
 - Detector geometry
 - Common mode subtraction
 - Sample-detector distance
 - Correlated behavior, outliers, non-linear corrections
 - Masking
 - Experimental detector corrections (solid angle coverage, polarization, jet geometry, sample composition)

Cube (used from 2016; see the binning sketch after this list)
 - Detector calibration
 - Common mode subtraction
 - Correlated behavior, outliers, non-linear corrections
 - Time-tool calibration
 - Binning
 - Outlier rejection (usually based on littledata analysis)
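
For the cube itself, the core operation is binning shots onto a corrected delay axis. A minimal sketch, with assumed names and units (ps), of how time-tool calibration and binning interact:

```python
import numpy as np

def cube_by_delay(nominal_delay, tt_correction, values, bin_edges):
    """Sketch of cube binning: apply the per-shot time-tool correction
    to the nominal pump-probe delay, then average a per-shot quantity
    into delay bins. Outlier rejection (e.g. from littledata) would
    filter the shots before this step."""
    delay = nominal_delay + tt_correction          # time-tool calibration applied
    idx = np.digitize(delay, bin_edges) - 1        # bin index per shot
    out = np.full(len(bin_edges) - 1, np.nan)
    for b in range(out.size):
        sel = idx == b
        if sel.any():
            out[b] = values[sel].mean()
    return out
```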

XES (dispersed spectral signal on small area detector)
 CSPAD 140k (before 2016)
  - Detector calibration
  - Common mode subtraction  
  - filtering based on XDS
  - pixel-by-pixel analysis to separate 1-photon peak from noise, the choice of algorithm depends on the signal strength on the detector
  - masking
 EPIX
  - Detector calibration
  - Dropletizing parameters (see the sketch after this list)
  - Droplet output
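
A hedged sketch of what dropletizing means here: group contiguous above-threshold pixels into droplets with scipy.ndimage, then report each droplet's position, summed ADU, and estimated photon count. The threshold and gain are illustrative stand-ins for the parameters the notes refer to.

```python
import numpy as np
from scipy import ndimage

def dropletize(frame, low_thresh=10.0, gain=30.0):
    """Toy dropletizer: label connected regions of above-threshold
    pixels, then compute each droplet's intensity-weighted centroid,
    summed ADU, and a photon estimate from the single-photon gain."""
    labels, n = ndimage.label(frame > low_thresh)
    index = np.arange(1, n + 1)
    sums = ndimage.sum(frame, labels, index=index)
    centroids = ndimage.center_of_mass(frame, labels, index=index)
    return [(cy, cx, adu, int(round(adu / gain)))
            for (cy, cx), adu in zip(centroids, sums)]
```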

XAS (0d signal on a diode or on a small area detector)
  - Detector calibration
  - Common mode subtraction  
  - masking

Feb. 22, 2017: "Introduction To Data Reduction"

Link to slides: 

Slides from Feb. 22, 2017 


Berkeley:
x10 reduction is a great goal

...