Overview: 

The following is how 'official' MC production is typically done, by submitting jobs to the JLab batch farm using xml scripts. To run these same steps from a command line with customized jar or steering files, please see 2015 MC Production using a Command Line.

There are 4 steps to MC production:  event generation, detector simulation, readout simulation, and reconstruction. The tools used are: 

  • Event Generation:  
    • MadGraph:  tridents & signal events 
    • EGS5:  beam-target interactions (includes Mollers), wide-angle bremsstrahlung 
    • GEANT4:  hadrons from the beam-target interaction
  • Detector Simulation: 
    • slic (GEANT4 based)
  • Readout Simulation: 
    • hps-java (on slic output)
  • Reconstruction: 
    • hps-java (on readout output, gives a recon .slcio file)
    • dst-maker & lcio api (gives a root file)

Currently, each step is performed in a separate batch job. The following event samples are made:

  • "beam-tri":  simulates the beam going though the target; the “tri” refers to trident events that are inserted at the expected rate.  This is intended to be an unbiased representation of our data
  • “tritrig-beam-tri”:  the “tritrig” refers to a sample of tridents that are preselected (at the MadGraph level) to be more likely to be in the detector acceptance and pass the trigger cuts so that we can efficiently get a large sample. These then have “beam-tri” overlaid in order to simulate what trident events look like in the detector. 
  • "wab-beam-tri": beam-tri with wide-angle bremsstrahlung events mixed in.
  • “ap” and “ap-beam-tri”:  the “ap” stands for “A prime” and is our signal MC. Signal events can be generated at a given mass and decay length (ctau), and are produced with and without beam overlay.
  • "moller" pure moller events. Note: beam backgrounds already contain moller events. "moller" is intended to be used for specialized studies involving them.

Below, each generation step for each sample is outlined. 

Batch Job Submission:

 

There is a basic example on the confluence here:

https://confluence.slac.stanford.edu/display/hpsg/Example+of+using+the+JLAB+Auger+system

Official production runs from here:

/u/group/hps/production/mc/EngRun2015Scripts

and the user should be logged in as 'hps'. You need to be put on the list to log in as this user, but once you are you don’t need a password; just log into JLAB unix as yourself and then ssh hps@ifarm65 (or any other machine). 
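For example, a minimal login sequence looks like the following (the gateway host shown is an assumption; substitute whatever JLab machine you normally log into, and any interactive farm node can replace ifarm65):

# from your own machine, log into JLab as yourself
ssh <your_jlab_user>@login.jlab.org
# then hop to the shared production account (no password once you are on the access list)
ssh hps@ifarm65
cd /u/group/hps/production/mc/EngRun2015Scripts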

There are two parts to the job submission:  the xml template that gets parsed and runs in the batch machine and the script (currently /bin/tcsh) that manipulates the xml template pre-submission and executes the “jsub” command to submit to the batch queue. An example of the xml template is at:

/u/group/hps/production/mc/EngRun2015Scripts/stdhep/beam.xml

The basic format of this xml file is: 

<Request>
  <AugerJobParameter parameter="blah"/>
  ... some more Auger parameters ...
  <Variable name="internal_variable"/>
  ... some more variables ...
  <List name="list_variable">l1 l2 l3 ... </List>
  <ForEach list="list_variable">
    <Job>
      <Command><![CDATA[
        ... shell script that executes some stuff ...
      ]]></Command>
      <Output src="out.slcio" dest="${out_dir}/${ebeam}/${out_file}_${num}.slcio" />
      <Stderr dest="${log_dir}/${ebeam}/${out_file}_${num}.err" />
      <Stdout dest="${log_dir}/${ebeam}/${out_file}_${num}.out" />
    </Job>
  </ForEach>
</Request>

This xml gets executed on the batch farm, but to submit it there we use a shell script like this: 

/u/group/hps/production/mc/EngRun2015Scripts/runjob.sh
hps@ifarm1401> cat runjob.sh 
#!/bin/tcsh
if ( $#argv != 4 ) then
 echo "$0 <filename> <ebeam> <firstnum> <lastnum>"
 echo "starts 1 job per num, 1 input file per job"
 exit
endif
set nums=`seq $3 $4`
cp $1 temp.xml
set apmass=`/u/group/hps/production/mc/run_params/apmass.csh $2`
# fill the <List name="num"> element with the requested job numbers
sed -i '/List .*\"num\"/{s/>.*</>'"$nums"'</}' temp.xml
# fill the <List name="apmass"> element with the A' masses for this beam energy
sed -i '/List .*\"apmass\"/{s/>.*</>'"$apmass"'</}' temp.xml
# set the ebeam <Variable> to the requested beam-energy string
sed -i '/Variable.*\"ebeam\"/{s/value=\".*\"/value=\"'"$2"'\"/}' temp.xml
#cat temp.xml
jsub -xml temp.xml

The last line, with “jsub”, is the one that actually submits the job to the batch queue.
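As a concrete illustration (the file numbers are arbitrary), submitting beam-electron generation jobs 1 through 10 at 1.05 GeV looks like:

cd /u/group/hps/production/mc/EngRun2015Scripts
./runjob.sh stdhep/beam.xml 1pt05 1 10    # 10 jobs, one per num, submitted via jsub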


IMPORTANT: Always submit the xml script with the proper type of runjob! (The xml header will remind you of this as well.) For example, beam-tri_100.xml is submitted with ./runjob_100.sh, and beam-tri_100to1.xml with ./runjob_100to1.sh.

Alternatively, there are 'smart' versions of each runjob script that are meant to avoid submitting jobs destined to fail. These 'smartrunjobs' first check that every input file to be used exists and that none of the output files already exist. If everything checks out, the script either submits the jobs or prints instructions for how to do so.

'smartrunjob_v1.py' takes the beam energy in the same place as the standard runjobs.

Run with: python smartrunjob_v1.py <job template> <energy> <firstnum> <lastnum>   

'smartrunjob_v2.py' takes in a variable template (e.g. EngRun2015params.xml) in place of the beam energy, which sets any desired run parameters before submitting jobs. (Check this template first!)

Run with: python smartrunjob_v2.py <job template> <variable template> <firstnum> <lastnum>
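For example (the template pairings and file numbers here are illustrative only):

# v1: beam energy goes in the same place as for the standard runjob scripts
python smartrunjob_v1.py slic/tritrig.xml 1pt05 1 50
# v2: a variable template (e.g. EngRun2015params.xml) replaces the beam-energy argument
python smartrunjob_v2.py slic/tritrig.xml EngRun2015params.xml 1 50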

More details on submitting jobs with 'runjob.sh' can be found in the following sections.


A few other helper scripts that are used in the auger xml are in: /u/group/hps/production/mc/run_params (a usage example follows the list below)

They are: 

  • ebeam.csh:  given the beam energy string (e.g. “1pt05”) returns the beam energy in MeV
  • dz.csh:  given the beam energy string, returns the target thickness
  • ne.csh:  given the beam energy string, returns the number of electrons per 2ns bunch
  • apmass.csh: given the beam energy string, returns a list of A’ masses to simulate
  • mu.csh: only used by lhe generators (e.g. tridents and wab), sets the number of bunches for each component
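These helpers can also be run by hand to check what a given beam-energy string maps to (assuming they are executable, as runjob.sh treats them; the exact values printed are whatever the scripts currently return):

cd /u/group/hps/production/mc/run_params
./ebeam.csh 1pt05     # beam energy in MeV
./dz.csh 1pt05        # target thickness
./ne.csh 1pt05        # electrons per 2 ns bunch
./apmass.csh 1pt05    # list of A' masses to simulate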

Event Generation: 

The event generation is performed via MadGraph, EGS5, or GEANT4, depending on the sample type. The different samples produced are listed below, along with some instructions for each.

 

  • beam electrons:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/stdhep/beam.xml
    •  run with ./runjob.sh stdhep/beam.xml 1pt05 <firstjob> <lastjob>
    • the executable that does the event generation is /u/group/hps/production/mc/egs5/beam_v3.exe
    • output is an .stdhep file with each event being the products from 1 scattered electron
  • beam hadrons:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/stdhep/hadrons.xml
    •  run with ./runjob.sh stdhep/hadrons.xml 1pt05 <firstjob> <lastjob>
    • the executable that does the event generation is a GEANT4 release that gets copied locally
    • output is an .stdhep file with each event being the hadrons from one bunch
  • unbiased tridents == “tri”:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/tri.xml
    •  run with ./runjob.sh lhe/tri.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/MadGraph/trigg/run_card_<ebeam>.dat
    • output is an lhe file with each event being 1 trident
    • these files are small, so they are bundled 100-to-1 with:
      • ./runjob_100to1.sh lhe/tri_merge_100to1.xml 1pt05 1 100
  • enhanced tridents == “tritrig”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/tritrig.xml
    •  run with ./runjob.sh lhe/tritrig.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/MadGraph/tritrig/run_card_<ebeam>.dat
      • the cuts here are E(e+e-) > 0.8*Ebeam and m(e+e-) > 10 MeV
    • output is an lhe file with each event being 1 trident
  • wide-angle bremsstrahlung == “wab”:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/wab.xml
    •  run with ./runjob.sh lhe/wab.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/MadGraph/wab/run_card_<ebeam>.dat
    • output is an lhe file with each event being 1 wab event
    • cross section is 0.7 barns, 6M events/s at 50 nA
    • cuts: E(photon) > 50 MeV, |theta_y(photon)| > 10 mrad, no cuts on e-
    • like the tridents, these files are also bundled 100-to-1 with:
      • ./runjob_100to1.sh lhe/wab_merge_100to1.xml 1pt05 1 100
  • signal events ==  “ap-truth”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/ap-truth.xml
    •  run with ./runjob.sh lhe/ap-truth.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/ap/run_card_<ebeam>.dat
    • also converts the lhe to stdhep while adding a displaced vertex (though ctau is currently hardcoded to 0 in the .xml)
    • output is an .stdhep file with each event being 1 A' event
  • moller electrons == "moller":  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/stdhep/mollers.xml
    •  run with ./runjob.sh stdhep/mollers.xml 1pt05 <firstjob> <lastjob>
    • the executable that does the event generation is /u/group/hps/production/mc/egs5/moller_v1.exe
    • output is an .stdhep file with each event containing a moller pair (93% efficiency) 
    • cross section: 0.09 barns, 800k events/s at 50 nA (258 events per 10^8 incident e-)
    • cuts: E > 10 MeV, |theta_y| > 10 mrad

Detector Simulation

The detector simulation is performed by slic, a front-end for GEANT4 that incorporates the HPS-specific geometry and writes the GEANT4 output into LCIO collections, which are then used by the reconstruction in hps-java. The batch scripts do more than just run slic, though: they may also mix components first, such as for the beam overlay. Below is what we do for each sample type:

 

  • beam simulation == “(wab-)beam-tri”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/slic/(wab-)beam-tri_100.xml
    •  run with ./runjob_100.sh slic/(wab-)beam-tri_100.xml 1pt05 <firstnum> <lastnum>
    •  this submits 100 jobs for each num between <firstnum> and <lastnum>, combining beam, tri, and hadron events into 2ns beam “bunches”, 1 per event
      •  e.g. <firstnum>=11, <lastnum>=20 submits 100*(20-11+1) = 1000 jobs, since each input num is used to make 100 output files; this example makes output files 1001-2000 (see the bookkeeping sketch after this list)
      • inputs: 1 .stdhep file each of beam, hadrons, and tri (bundled 100to1) for each num
      • outputs: 100 .slcio files for each num, each sampled differently from the mixed input files
  • enhanced tridents == “tritrig”
    •  auger xml: /u/group/hps/production/mc/EngRun2015Scripts/slic/tritrig.xml
    •  run with ./runjob.sh slic/tritrig.xml 1pt05 <firstnum> <lastnum>
    •  runs slic over the tridents, 1 trident-per-event
  • signal events == “ap-truth”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/slic/aptruth.xml
    •  run with ./runjob.sh slic/aptruth.xml 1pt05 <firstnum> <lastnum>
    •  runs slic over signal events, 1 signal event-per-event
  • moller events == “moller”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/slic/moller.xml
    •  run with ./runjob.sh slic/moller.xml 1pt05 <firstnum> <lastnum>
    •  runs slic over moller events, 1 moller pair per event
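To make the *_100 numbering convention explicit, here is a small illustrative tcsh sketch (not a production script) of the bookkeeping described above, assuming 100 output files per num:

#!/bin/tcsh
# illustrative bookkeeping for a *_100 step, assuming 100 outputs per num
set firstnum=11
set lastnum=20
# number of jobs submitted
@ nnums = $lastnum - $firstnum + 1
@ njobs = 100 * $nnums
# output files for num n are 100*(n-1)+1 through 100*n
@ firstfile = $firstnum - 1
@ firstfile = $firstfile * 100 + 1
@ lastfile = $lastnum * 100
echo "nums $firstnum-$lastnum : $njobs jobs, output files $firstfile-$lastfile"

With the values above this prints 1000 jobs and output files 1001-2000, matching the example in the list.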

Readout Simulation

Simulation of the SVT and ECal digital readout is performed using the hps-java framework.


  • beam background readout == "beam-tri"
    • auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/readout/beam-tri_<trigger>_100to1.xml
      • <trigger> can be 'singles' (event occurs for each ECal cluster passing the cuts -> ~100k events) or 'pairs' (2 clusters on opposite sides pass the cuts within a time window -> ~1k events) 
    •  run with ./runjob_100to1.sh <auger xml> 1pt05 <firstnum> <lastnum>
    •  this submits 1 job for each num between <firstnum> and <lastnum>, running the readout simulation over the 100 files from the 'Detector Simulation' step and combining them into 1 output file
      • e.g. <firstnum>=11, <lastnum>=20 submits 1 + (20-11) = 10 jobs, which makes output files 11 through 20
      • inputs: 100 sampled .slcio files for each num between <firstnum> and <lastnum>, e.g. num=12 takes in files 1101-1200 as input (see the example after this list)
      • output: 1 readout .slcio file: the combined readout of the inputs

     
  • enhanced tridents with beam background == “tritrig-beam-tri”
    •  auger xml: /u/group/hps/production/mc/EngRun2015Scripts/readout/tritrig-beam-tri_<trigger>.xml
    •  run with ./runjob.sh <auger xml> 1pt05 <firstnum> <lastnum>
  • signal events == “mock”
    •  auger xml: /u/group/hps/production/mc/EngRun2015Scripts/readout/mock.xml
    •  run with ./runjob.sh <auger xml> 1pt05 <firstnum> <lastnum>
  • moller events == “moller”
    •  auger xml: /u/group/hps/production/mc/EngRun2015Scripts/readout/moller_<trigger>_100to1.xml
    •  run with ./runjob_100to1.sh <auger xml> 1pt05 <firstnum> <lastnum>
    • moller files individually processed by slic are read out in groups of 100-to-1
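For instance, following the numbering convention above, the pairs-trigger readout for beam-tri nums 11 through 20 would be submitted with something like the following (the trigger choice and num range are illustrative):

cd /u/group/hps/production/mc/EngRun2015Scripts
# 10 jobs; num 11 reads the 100 slic files 1001-1100, num 12 reads 1101-1200, and so on
./runjob_100to1.sh readout/beam-tri_pairs_100to1.xml 1pt05 11 20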

Reconstruction

Once the readout is obtained, tracks can be reconstructed, creating HPS-Event objects containing pertinent quantities (cluster energy, etc.) that can then be analyzed.


  • beam background readout == "(wab-)beam-tri"
    • auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/recon/(wab-)beam-tri_<trigger>.xml
    •  run with ./runjob.sh <auger xml> 1pt05 <firstnum> <lastnum> 
     
  • enhanced tridents with beam background == “tritrig-beam-tri”
    •  auger xml: /u/group/hps/production/mc/EngRun2015Scripts/recon/tritrig-beam-tri_<trigger>.xml
    •  run with ./runjob.sh <auger xml> 1pt05 <firstnum> <lastnum>
  • signal events == “mock”
    •  auger xml: /u/group/hps/production/mc/EngRun2015Scripts/recon/mock.xml
    •  run with ./runjob.sh <auger xml> 1pt05 <firstnum> <lastnum>
  • moller events == “moller”
    •  auger xml: /u/group/hps/production/mc/EngRun2015Scripts/recon/moller_<trigger>.xml
    •  run with ./runjob.sh <auger xml> 1pt05 <firstnum> <lastnum>




