...
MatLab | MatPlotLib | Comments
---|---|---
Loglog plot of one array vs. another | Loglog plot of one array vs. another | channels is a 4xN array of floats, where N is the number of events. Each column corresponds to one out of four Ipimb channels.
test | test | Test
array of limits from graphical input | array of limits from graphical input | In MatLab,
filter | filter | Comment
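The code cells of this table were lost in the page conversion. Purely as a hedged sketch of the first recipe (a loglog plot of one Ipimb channel against another) in matplotlib, with random stand-in data in place of the real `channels` array:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; interactively you would use plt.show()
import matplotlib.pyplot as plt

# stand-in for the real data: a 4xN array of positive floats,
# one row per Ipimb channel, one column per event
channels = np.abs(np.random.randn(4, 1000)) + 1e-3

plt.loglog(channels[0], channels[1], '.')
plt.xlabel("channel 0")
plt.ylabel("channel 1")
plt.savefig("loglog.png")
```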
Writing Numpy and HDF5 files from python
You can take numpy arrays produced in a pyana job (which reads XTC files) and store them in simple NumPy files or in HDF5 files. Here are some examples:
Simple array to a NumPy file:
```python
import numpy as np

np.save("filename.npy", array)       # binary NumPy format
array = np.load("filename.npy")

np.savetxt("filename.dat", array)    # plain ascii text
array = np.loadtxt("filename.dat")
```
This example shows saving and loading of a binary numpy file (.npy) and an ascii file (.dat).
np.savetxt only handles a single array of at most two dimensions (np.save stores one array of any shape per file).
If you need to save multiple events/shots in the same file you will need some tricks (e.g. flatten each event's array and stack the 1-D arrays into a 2-D array where one axis represents the event number). Or you could save as an HDF5 file.
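A minimal sketch of that stacking trick, using made-up per-event 1-D arrays; np.savez (which stores several named arrays in one .npz file) is shown as an alternative:

```python
import numpy as np

# hypothetical per-event 1-D arrays (one trace per shot) -- made-up data
events = [np.arange(5, dtype=float) * i for i in range(1, 4)]

# trick from the text: stack the 1-D arrays into a 2-D array,
# so one axis of the stacked array is the event number
stacked = np.vstack(events)          # shape (3, 5): event index on axis 0
np.save("events.npy", stacked)
back = np.load("events.npy")

# alternative: np.savez stores several named arrays in a single .npz file
np.savez("events.npz", **{"shot%d" % i: a for i, a in enumerate(events)})
with np.load("events.npz") as archive:
    shot1 = archive["shot1"]
```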
Simple array to an HDF5 file
```python
import h5py

def beginjob(self, evt, env):
    # open for writing (overwrites existing file)
    self.ofile = h5py.File("outputfile.hdf5", 'w')
    self.shot_counter = 0

def event(self, evt, env):
    # example: store several arrays from one shot in a group
    # labeled with shot (event) number
    self.shot_counter += 1
    group = self.ofile.create_group("Shot%d" % self.shot_counter)
    image1_source = "CxiSc1-0|TM6740-1"
    image2_source = "CxiSc1-0|TM6740-2"
    frame = evt.getFrameValue(image1_source)
    image1 = frame.data()
    frame = evt.getFrameValue(image2_source)
    image2 = frame.data()
    dataset1 = group.create_dataset("%s" % image1_source, data=image1)
    dataset2 = group.create_dataset("%s" % image2_source, data=image2)

def endjob(self, env):
    self.ofile.close()
```
This example is written as a pyana user module: the HDF5 file is opened in beginjob, datasets are created for each event, and the file is closed in endjob.
Or you can group your datasets any other way you find useful, of course.
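Reading such a file back is the mirror image: h5py files and groups behave like nested dictionaries. A minimal sketch, assuming the same Shot&lt;n&gt; group layout as above (the `camera` dataset name and the dummy data here are hypothetical, used only so the snippet is self-contained):

```python
import h5py
import numpy as np

# write a small file in the same Shot<n>-group layout as the pyana example;
# the "camera" dataset name and the dummy data are made up for illustration
with h5py.File("outputfile.hdf5", "w") as f:
    for shot in (1, 2):
        group = f.create_group("Shot%d" % shot)
        group.create_dataset("camera", data=np.full((4, 4), float(shot)))

# read it back: files and groups iterate like dictionaries of named members
with h5py.File("outputfile.hdf5", "r") as f:
    for shot_name, group in f.items():
        for ds_name, dataset in group.items():
            image = dataset[...]   # [...] reads the dataset into a numpy array
            print(shot_name, ds_name, image.shape)
```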
Saving complex datasets to HDF5 file
Some more advanced examples (courtesy of Hubertus Bromberger):
...