Beam Test Pipeline Overview
The pipeline automatically retrieves all online data produced by LATTE, which are stored in directories organized by run number, and transfers them to the SLAC farm. It then populates an ORACLE database that provides queries over the data. The pipeline also creates reports and launches data processing/reconstruction code to produce data files and high-level analysis ntuples.
A pipeline diagram can be seen below.
The xml files used for upload are located in
TODO: explain how the tasks get launched by FastCopy.
Some existing documentation can be found in /afs/slac/g/glast/ground/PipelineConfig/BeamTest-tasks/beamtestPipeline/current/doc.
install.txt and operation.txt may both be read directly. Running make in the doc directory will use IAndTPipeline.tex and several .dot files to make a .pdf.
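The doc build above boils down to a `cd` into the doc directory and a `make`; the snippet below sketches that pattern with a throwaway Makefile in `/tmp` (all demo paths and the stand-in recipe are hypothetical), since the real build runs LaTeX on IAndTPipeline.tex and Graphviz on the .dot files under AFS.

```shell
# Real invocation (on the SLAC farm, not runnable here):
#   cd /afs/slac/g/glast/ground/PipelineConfig/BeamTest-tasks/beamtestPipeline/current/doc
#   make
# Demo: mimic the target/dependency flow with a throwaway Makefile.
mkdir -p /tmp/doc_demo && cd /tmp/doc_demo
printf 'pipeline.pdf: IAndTPipeline.tex\n\t@echo "stand-in for pdflatex/dot output" > pipeline.pdf\n' > Makefile
touch IAndTPipeline.tex      # stand-in for the real LaTeX source
make                          # builds the (fake) PDF from its dependency
ls pipeline.pdf
```

In the real doc directory, `make` only rebuilds the PDF when IAndTPipeline.tex or the .dot files change, which is the same dependency flow shown above.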
Policy for updating tasks
Describes the policy and steps for updating pipeline tasks.
Environmental Variables
Setting up the environment so that one can access the ORACLE database.
Source /u/gl/glast/pdb_config/dpf_config_prod.csh (or the .sh version if you use bash) to run the pipeline text-mode management tools, which are installed in $PDB_HOME (set by the config script).
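The effect of sourcing the config script can be sketched as follows. This is an illustration only: the demo script path and the `PDB_HOME` value below are made up, while the real scripts are `/u/gl/glast/pdb_config/dpf_config_prod.csh` (tcsh/csh) and `.sh` (bash).

```shell
# Mimic the config script with a throwaway stand-in (hypothetical paths):
cat > /tmp/dpf_config_demo.sh <<'EOF'
# the real script points PDB_HOME at the pipeline management-tool install
export PDB_HOME=/tmp/pdb_demo_home
EOF
# Sourcing (not executing) the script puts PDB_HOME in the current shell:
. /tmp/dpf_config_demo.sh
echo "PDB_HOME=$PDB_HOME"
```

Note that the script must be sourced, not run as a subprocess: a child shell could not modify the caller's environment, so `$PDB_HOME` would be empty afterwards.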
Pipeline Tasks and Associated Scripts
Each pipeline task consists of several shell, Python, and Perl scripts.
The code for the bt pipeline is found in /afs/slac/g/glast/ground/PipelineConfig/BeamTest-tasks/beamtestPipeline/current
The code is maintained in a CVS repository.
The list of pipeline tasks is provided below, along with information on how to run them.
Task Name: eLogupdate
Description: loads the database
| Purpose | Associated Scripts | Input | Output | Comments |
|---|---|---|---|---|
| | archiveWrapper.pl | | | |
| | ConfTLaunchWrapper.pl | | | |
| | decideDigi.pl | | | |
| | genXml.pl | | | |
| | ldfTDLaunchWrapper.pl | | | |
| | populateElogDb.pl | | | |
| | populateElogDbWrapper.pl | | | |
| | retDefTDLaunchWrapper.pl | | | |