...
The most pain-free way to access LCLS XTC data files from Python is through LCLS's Python framework, pyana. It is a non-interactive framework, but to some extent you can work interactively with the data it produces.
All about pyana.
C++ framework: psana
The idea is the same as for pyana: non-interactive, with no interactive support as of yet.
All about psana - Original Documentation.
If you like GUIs:
The XTC Explorer - Old gives you an "interactive" way to configure your analysis.
...
Python/IPython can be used to analyze data after you've saved it, or it can be embedded into a pyana module to give you interactive access to the data at regular intervals throughout your analysis.
'IPython' (http://ipython.org/) is an enhanced python shell for interactive use. Many of the examples here would work equally well with a 'regular' python shell.
Plotting is done with 'matplotlib' (http://matplotlib.sourceforge.net/).
If you're looking for an IDE to work with, consider 'Spyder' (http://code.google.com/p/spyderlib/).
Interactively exploring the XTC file.
...
With interactive python embedded, see: https://confluence.slac.stanford.edu/display/PCDS/XTC+Explorer#XTCExplorer-InteractiveplottingwithIPython
IPython used "like" MATLAB
...
Table of comparison (MATLAB vs MatPlotLib)
See also http://www.scipy.org/NumPy_for_Matlab_Users
MatLab | MatPlotLib | Comments
---|---|---
Loglog plot of one array vs. another | Loglog plot of one array vs. another | `channels` is a 4xN array of floats, where N is the number of events. Each row corresponds to one of the four Ipimb channels.
array of limits from graphical input | array of limits from graphical input |
filter | filter |
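As a sketch of the first row of the table above, here is how a loglog plot of one Ipimb channel against another might look in matplotlib. The `channels` array here is synthetic stand-in data; in a real analysis it would be filled from the Ipimb readout:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs in scripts
import matplotlib.pyplot as plt

# Stand-in for the 4xN Ipimb channels array (positive values for log axes)
N = 1000
channels = np.abs(np.random.randn(4, N)) + 0.1

# Loglog plot of channel 0 vs. channel 1, one marker per event
plt.loglog(channels[0], channels[1], 'o', markersize=2)
plt.xlabel("Ipimb channel 0")
plt.ylabel("Ipimb channel 1")
plt.savefig("loglog_channels.png")
```

In an IPython session you would typically call `plt.show()` instead of `plt.savefig()` to get an interactive figure window.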
Writing NumPy and HDF5 files from Python
You can take numpy arrays from a pyana job (which reads the XTC file) and store them in simple numpy files or HDF5 files. Here are some examples:
Simple array to a NumPy file:
```python
import numpy as np

np.save("filename.npy", array)      # save as binary numpy file
array = np.load("filename.npy")     # load it back
np.savetxt("filename.dat", array)   # save as ascii file
array = np.loadtxt("filename.dat")  # load it back
```
This example shows saving and loading of a binary numpy file (.npy) and an ASCII file (.dat).
Each file holds a single array; np.savetxt is additionally limited to arrays of at most 2 dimensions.
If you need to save multiple events/shots in the same file you will need to do some tricks (e.g. flatten each array and stack the resulting 1-D arrays into a 2-D array where one axis represents event number), or you could save to an HDF5 file instead.
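The flatten-and-stack trick can be sketched as follows; the per-event images here are synthetic placeholders for whatever arrays your pyana job collects:

```python
import numpy as np

# Hypothetical per-event 2-D images, e.g. accumulated in a list during a pyana job
events = [np.random.rand(4, 5) for _ in range(3)]

# Flatten each image to 1-D and stack, so axis 0 is the event number
stacked = np.vstack([img.ravel() for img in events])  # shape (3, 20)

np.save("all_events.npy", stacked)
loaded = np.load("all_events.npy")

# Recover the original image shape for any given event
first_image = loaded[0].reshape(4, 5)
```

This only works if every event produces an array of the same shape; otherwise an HDF5 file with one dataset per event is the more natural choice.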
Simple array to an HDF5 file
```python
import h5py

def beginjob(self, evt, env):
    # open for writing (overwrites an existing file)
    self.ofile = h5py.File("outputfile.hdf5", 'w')
    self.shot_counter = 0

def event(self, evt, env):
    # example: store several arrays from one shot in a group
    # labeled with the shot (event) number
    self.shot_counter += 1
    group = self.ofile.create_group("Shot%d" % self.shot_counter)
    image1_source = "CxiSc1-0|TM6740-1"
    image2_source = "CxiSc1-0|TM6740-2"
    frame = evt.getFrameValue(image1_source)
    image1 = frame.data()
    frame = evt.getFrameValue(image2_source)
    image2 = frame.data()
    dataset1 = group.create_dataset("%s" % image1_source, data=image1)
    dataset2 = group.create_dataset("%s" % image2_source, data=image2)

def endjob(self, env):
    self.ofile.close()
```
This example is shown in a pyana setting. The HDF5 file is opened in the beginjob method, datasets are created for each event, and the file is closed in the endjob method.
Or you can group your datasets any other way you find useful, of course.
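To read such a file back in an interactive session, you can iterate over the shot groups with h5py. The snippet below writes a tiny file in the same "Shot%d" layout (with dummy data and the same camera source name used as a placeholder) and then walks through it:

```python
import h5py
import numpy as np

# Write a small file in the same "Shot%d" layout as above (dummy data)
with h5py.File("outputfile.hdf5", "w") as f:
    for shot in (1, 2):
        group = f.create_group("Shot%d" % shot)
        group.create_dataset("CxiSc1-0|TM6740-1", data=np.full((2, 2), shot))

# Read it back: iterate over shot groups and pull each image as a numpy array
with h5py.File("outputfile.hdf5", "r") as f:
    for name, group in f.items():
        image = group["CxiSc1-0|TM6740-1"][...]  # [...] reads the full dataset
        print(name, image.shape)
```

Tools like `h5ls` or `hdfview` are also handy for a quick look at the file layout.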
Saving complex datasets to HDF5 file
Some more advanced examples (courtesy of Hubertus Bromberger):
...