...

Alternatively, there are 'smart' versions of each runjob script that are meant to avoid submitting jobs that are destined to fail. These 'smartrunjob' scripts first check that every input file to be used exists and that none of the output files already exist. If everything checks out, the script either submits the jobs or prints instructions for how to do so.
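As a rough illustration, the pre-flight check a smartrunjob performs could look something like the sketch below; the variable names and exact logic are placeholders, not the actual script contents.

    #!/bin/sh
    # Hedged sketch of a smartrunjob-style pre-flight check; the real scripts may differ.
    # $INPUTS, $OUTPUTS, $XML, $EBEAM, $FIRST, and $LAST are illustrative placeholders.
    for f in $INPUTS; do
      [ -e "$f" ] || { echo "missing input: $f"; exit 1; }
    done
    for f in $OUTPUTS; do
      [ -e "$f" ] && { echo "output already exists: $f"; exit 1; }
    done
    # Everything checks out: hand off to the regular submission script.
    exec ./runjob.sh "$XML" "$EBEAM" "$FIRST" "$LAST"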

...

  • ebeam.csh:  given the beam energy string (e.g. “1pt05”) returns the beam energy in MeV
  • dz.csh:  given the beam energy string, returns the target thickness
  • ne.csh:  given the beam energy string, returns the number of electrons per 2ns bunch
  • apmass.csh: given the beam energy string, returns a list of A’ masses to simulate (see the usage sketch below)
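For reference, a minimal sketch of how these helpers might be called from a submission script; the exact argument and output conventions are assumed from the descriptions above, not verified against the scripts.

    # Hedged usage sketch; run from the directory containing the helper scripts.
    EBSTR=1pt05                        # beam energy string
    ENERGY=$(./ebeam.csh  "$EBSTR")    # beam energy in MeV
    DZ=$(./dz.csh         "$EBSTR")    # target thickness
    NE=$(./ne.csh         "$EBSTR")    # electrons per 2ns bunch
    MASSES=$(./apmass.csh "$EBSTR")    # list of A' masses to simulate
    echo "E=$ENERGY MeV  dz=$DZ  ne=$NE  A' masses: $MASSES"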

Event Generation:  

As mentioned, the event generation is performed via MadGraph, EGS5, or GEANT4 depending on the sample type. The list below covers each type of event produced, with some instructions for each; a combined submission sketch follows the list.

...

  • beam electrons:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/stdhep/beam.xml
    •  run with ./runjob.sh stdhep/beam.xml 1pt05 <firstjob> <lastjob>
    • the executable that does the event generation is /u/group/hps/production/mc/egs5/beam_v3.exe
    • output is an .stdhep file with each event being 1 scattered electron (I think…1 event _may_ be 1 bunch) 
  • beam hadrons:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/stdhep/hadrons.xml
    •  run with ./runjob.sh stdhep/hadrons.xml 1pt05 <firstjob> <lastjob>
    • the executable that does the event generation is a GEANT4 release that gets copied locally
    • output is an .stdhep file with each event being hadrons from bunch
  • unbiased tridents == “tri”:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/tri.xml
    •  run with ./runjob.sh lhe/tri.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/trigg/run_card_<ebeam>.dat
    • output is an lhe file with each event being 1 trident
    • these files are small, so they are bundled 100-to-1 with:
      • ./runjob_100to1.sh lhe/tri_merge_100to1.xml 1pt1 1 100
  • enhanced tridents == “tritrig”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/tritrig.xml
    •  run with ./runjob.sh lhe/tritrig.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/tritrig/run_card_<ebeam>.dat
      • the cuts here are E(e+e-) > 0.8*Ebeam and m(e+e-) > 10 MeV
    • output is an lhe file with each event being 1 trident
  • signal events ==  “ap-truth”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/ap-truth.xml
    •  run with ./runjob.sh lhe/ap-truth.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/ap/run_card_<ebeam>.dat
    • also converts the .lhe to .stdhep while adding a displaced vertex (though ct is currently hardcoded to 0 in the .xml)
    • output is an stdhep file with each event being 1 trident
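Putting the above together, one round of event-generation submissions at 1.05 GeV might look like the following; the job ranges are illustrative, and the commands assume they are run from the EngRun2015Scripts directory, as the relative paths above suggest.

    cd /u/group/hps/production/mc/EngRun2015Scripts
    ./runjob.sh stdhep/beam.xml    1pt05 1 50    # beam electrons (EGS5)
    ./runjob.sh stdhep/hadrons.xml 1pt05 1 50    # beam hadrons (GEANT4)
    ./runjob.sh lhe/tri.xml        1pt05 1 50    # unbiased tridents (MadGraph)
    ./runjob.sh lhe/tritrig.xml    1pt05 1 50    # enhanced tridents (MadGraph)
    ./runjob.sh lhe/ap-truth.xml   1pt05 1 50    # A' signal events (MadGraph)
    # unbiased tridents are small, so bundle them 100-to-1 afterwards:
    ./runjob_100to1.sh lhe/tri_merge_100to1.xml 1pt05 1 50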

 

Detector Simulation

 

The detector simulation is performed by slic, a front-end for GEANT4 that incorporates the specific geometry HPS uses and writes the GEANT4 output into the LCIO collections used by the reconstruction in hps-java. The batch scripts do more than just run slic, though; they may also mix components first, such as with the beam overlay. Below is what we do for each sample type; a submission sketch follows the list:

...

  • beam simulation == “beam-tri”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/slic/beam-tri_100.xml
    •  run with ./runjob_100.sh slic/beam-tri_100.xml 1pt05 <firstnum> <lastnum>
    •  this submits 100 jobs for each num between <firstnum> and <lastnum>, combining beam, tri, and hadron events into 2ns beam “bunches”, 1 per event
      •  e.g. <firstnum>=11, <lastnum>=20 submits 100 + 100*(20-11) = 1000 jobs, since each input # is used to make 100 output files. This example makes output files 1001-2000.
      • inputs: 1 .stdhep file each of beam, hadrons, and tri (bundled 100to1) for each num
      • outputs: 100 .slcio files for each num, each sampled differently from the mixed input files
  • enhanced tridents == “tritrig”
    •  auger xml: /u/group/hps/production/mc/EngRun2015Scripts/slic/tritrig.xml
    •  run with ./runjob.sh slic/tritrig.xml 1pt05 <firstnum> <lastnum>
    •  runs slic over the tridents, 1 trident-per-event
  • signal events == “ap-truth”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/slic/aptruth.xml
    •  run with ./runjob.sh slic/aptruth.xml 1pt05 <firstnum> <lastnum>
    •  runs slic over signal events, 1 signal event-per-event
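To make the beam-tri job counting concrete, the example above (nums 11-20) together with the plain slic steps would be submitted roughly as follows; the ranges are illustrative.

    cd /u/group/hps/production/mc/EngRun2015Scripts
    # beam-tri mixing: 100 jobs per num -> 100 + 100*(20-11) = 1000 jobs, output files 1001-2000
    ./runjob_100.sh slic/beam-tri_100.xml 1pt05 11 20
    # plain slic over the tridents and signal events, 1 job per num
    ./runjob.sh slic/tritrig.xml 1pt05 11 20
    ./runjob.sh slic/aptruth.xml 1pt05 11 20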

...

  • beam background readout == "beam-tri"
    • auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/readout/beam-tri_<trigger>_100to1.xml
      • <trigger> can be 'singles' (event occurs for each ECal cluster passing the cuts -> ~100k events) or 'pairs' (2 clusters on opposite sides pass the cuts within a time window -> ~1k events) 
    •  run with ./runjob_100to1.sh <auger xml> 1pt05 <firstnum> <lastnum>
    •  this submits 1 job for each num between <firstnum> and <lastnum>, running the simulated readout over the 100 files from the 'Detector Simulation' step and combining them into 1 output file
      • e.g. <firstnum>=11, <lastnum>=20 submits 1 + (20-11) = 10 jobs, which makes output files 11 through 20
      • inputs: 100 sampled .slcio files for each num between <firstnum> and <lastnum>, e.g. num=12 takes in files 1101-1200 as input
      • output: 1 readout .slcio file: the combined readout of the inputs
         
     
  • enhanced tridents with beam background == “tritrig-beam-tri”
    •  auger xml: /u/group/hps/production/mc/EngRun2015Scripts/readout/tritrig-beam-tri_<trigger>.xml
    •  run with ./runjob.sh <auger xml> 1pt05 <firstnum> <lastnum>
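Continuing the same example through the readout step, with 'pairs' filled in for <trigger> purely for illustration:

    cd /u/group/hps/production/mc/EngRun2015Scripts
    # beam background readout: 1 job per num; e.g. num=12 combines slic files 1101-1200 into output file 12
    ./runjob_100to1.sh readout/beam-tri_pairs_100to1.xml 1pt05 11 20
    # enhanced tridents with beam background overlay
    ./runjob.sh readout/tritrig-beam-tri_pairs.xml 1pt05 11 20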