...
Instructions for submitting batch jobs to run in parallel are here: Batch System Analysis Jobs
Analyzing Scans
Scan (step) variables are exposed as detectors; use `detnames -s` to list them:

```
(ps-4.1.4) psanagpu104:lcls2$ detnames -s exp=rixdaq18,run=17
--------------------------
Name           | Data Type
--------------------------
motor1         | raw
motor2         | raw
step_value     | raw
step_docstring | raw
--------------------------
(ps-4.1.4) psanagpu104:lcls2$
```
These scan variables can then be read back in psana, once per step:

```python
from psana import DataSource

ds = DataSource(exp='rixdaq18', run=17)
myrun = next(ds.runs())

# Scan (step) variables are accessed like detectors
motor1 = myrun.Detector('motor1')
motor2 = myrun.Detector('motor2')
step_value = myrun.Detector('step_value')
step_docstring = myrun.Detector('step_docstring')

for step in myrun.steps():
    # Step detectors are called with the step, not an event
    print(motor1(step), motor2(step), step_value(step), step_docstring(step))
    for evt in step.events():
        pass  # per-event analysis goes here
```
MPI Task Structure
To allow for scaling, multiple hdf5 files are written, one per "SRV" node. The number of SRV nodes is set by the environment variable PS_SRV_NODES (default 0). psana then joins the per-node files into what appears to be a single file using the hdf5 "virtual dataset" feature.
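The stitching relies on standard hdf5 virtual datasets, which h5py exposes directly. Below is a minimal sketch of the idea, with hypothetical file names, dataset names, and shapes; it is not psana's actual internal layout:

```python
import h5py
import numpy as np

# Hypothetical per-SRV-node part files (psana chooses its own names).
part_files = ['srv_part0.h5', 'srv_part1.h5']
n_per_file = 5

# Create example part files; in psana each SRV node writes its own.
for i, fname in enumerate(part_files):
    with h5py.File(fname, 'w') as f:
        f.create_dataset('data', data=np.arange(n_per_file) + n_per_file * i)

# Build a virtual layout that concatenates the per-node datasets.
layout = h5py.VirtualLayout(shape=(n_per_file * len(part_files),), dtype='i8')
for i, fname in enumerate(part_files):
    layout[i * n_per_file:(i + 1) * n_per_file] = h5py.VirtualSource(
        fname, 'data', shape=(n_per_file,))

# The "master" file presents one contiguous dataset to the reader.
with h5py.File('master.h5', 'w') as f:
    f.create_virtual_dataset('data', layout)

with h5py.File('master.h5', 'r') as f:
    print(f['data'][:])  # reads span the part files transparently
```

In practice the environment variable is set before launching the MPI job, e.g. `PS_SRV_NODES=2 mpirun -n 8 python analysis.py` (the rank counts here are illustrative).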
...