The data quality management (DQM) system provides important histograms that allow subsystem experts to check the quality of the data soon after it is taken.

The DQM is run using a swif workflow, which is a collection of Auger jsub requests. Each workflow has three phases (a rough sketch of the corresponding swif commands follows the list):

  1. Run Recon and DQM on the first 10k events of each individual evio file. This phase has one job per evio file and takes the longest time per job.
  2. Sum the DQM files using hadd and then renormalize the SVT occupancy plots.
  3. Add all the DQM root files to the data catalog.  

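For orientation, the sketch below shows roughly how these three phases could be laid out as swif jobs. It is illustrative only: the workflow name, the -phase option, and the placeholder commands are assumptions, and in practice the whole workflow is generated automatically by the mkworkflow.py script described below.

    # Illustrative sketch only -- the real workflow is built by mkworkflow.py.
    # The workflow name and the -phase flag are assumptions for this example.
    swif create -workflow hps_dqm_7373

    # Phase 1: one Recon+DQM job per evio file (the longest-running phase)
    swif add-job -workflow hps_dqm_7373 -phase 1 <recon/DQM command for one evio file>

    # Phase 2: hadd the per-file DQM histograms, then renormalize the SVT occupancy plots
    swif add-job -workflow hps_dqm_7373 -phase 2 <hadd and renormalization command>

    # Phase 3: register the combined DQM root file in the data catalog
    swif add-job -workflow hps_dqm_7373 -phase 3 <data catalog registration command>

    # Submit the workflow
    swif run -workflow hps_dqm_7373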

To run DQM:

  1. Check that all the evio files have been copied from the counting house to the mss tape before proceeding.
  2. Log into one of the ifarm machines at JLab as hps and execute the following commands to create and run the workflow, replacing runnumber with the actual run number (an example session is sketched below):
    • cd /group/hps/production/dqm/scripts
    • python mkworkflow.py runnumber --request

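An example session might look like the following. The /mss path and file name pattern used in the check, and the workflow name passed to swif status, are assumptions for illustration.

    # 1. Confirm the evio files for the run are on tape
    #    (the /mss path and file name pattern shown here are assumptions)
    ls /mss/hallb/hps/physrun2016/data/hps_007373.evio.*

    # 2. Create and submit the DQM workflow for run 7373
    cd /group/hps/production/dqm/scripts
    python mkworkflow.py 7373 --request

    # Monitor progress (workflow name is an assumption -- see swif list for the actual name)
    swif status -workflow hps_dqm_7373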

The DQM files will be found in /work/hallb/hps/physrun2016/pass0/dqm

An example of a DQM output file name generated from a single evio file is:

       hps_7373.0_dqm_3.6-SNAPSHOT.root

An example of a combined DQM output file name is:

   hps_7373_dqm_3.6-SNAPSHOT.root

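For illustration, the phase 2 merge that turns the per-file outputs into the combined file is essentially an hadd call like the one sketched below; the exact production invocation and the SVT occupancy renormalization are handled by the workflow scripts.

    cd /work/hallb/hps/physrun2016/pass0/dqm

    # Merge the per-evio-file DQM histograms for run 7373 into a single file;
    # -f overwrites the output file if it already exists
    hadd -f hps_7373_dqm_3.6-SNAPSHOT.root hps_7373.*_dqm_3.6-SNAPSHOT.root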