
Overview: 

There are 4 steps to MC production:  event generation, detector simulation, readout simulation, and reconstruction. The tools used are: 

  • Event Generation:  
    • MadGraph:  tridents & signal events 
    • EGS5:  beam-target interactions (includes Mollers), wide-angle bremsstrahlung 
    • GEANT4:  hadrons from the beam-target interaction
  • Detector Simulation: 
    • slic (GEANT4 based)
  • Readout Simulation: 
    • hps-java (on slic output)
  • Reconstruction: 
    • hps-java (on readout output…gives a recon .slcio file)
    • dst-maker & lcio api (on recon output…gives a root file)

Currently, each step is performed in a separate batch job, though it may be possible to combine some steps to save wall/CPU time and tape storage.  Something to keep in mind...

We make the following event samples:

  • "beam-tri":  simply simulates the beam going through the target; the “tri” refers to trident events that are inserted at the expected rate.  This is intended to be an unbiased representation of our data.
  • “tritrig-beam-tri”:  the “tritrig” refers to a sample of tridents that are preselected (at the MadGraph level) to be more likely to be in the detector acceptance and pass the trigger cuts, so that we can efficiently get a large sample.  These then have “beam-tri” overlaid in order to simulate what trident events look like in the detector.  
  • “ap” and “ap-beam-tri”:  the “ap” stands for “A prime” and is our signal MC…they can be generated at a given mass and ctau, and are produced with and w/o beam overlay (mostly with).  
  • Moller and wide angle bremsstrahlung samples

Below, each generation step for each sample is outlined. 

Batch Job Submission:

 

There is a basic example on the confluence here:

https://confluence.slac.stanford.edu/display/hpsg/Example+of+using+the+JLAB+Auger+system

Official production runs from here:

/u/group/hps/production/mc/EngRun2015Scripts

and the user should be logged in as 'hps'. You need to be added to the list of users allowed to log in as this account, but once you are, no password is needed: just log into JLAB unix as yourself and then ssh hps@ifarm65 (or any other machine). 

There are two parts to the job submission:  the xml template that gets parsed and runs in the batch machine and the script (currently /bin/tcsh) that manipulates the xml template pre-submission and executes the “jsub” command to submit to the batch queue. An example of the xml template is at:

/u/group/hps/production/mc/EngRun2015Scripts/stdhep/beam.xml

The basic format of this xml file is: 

<Request>
 <AugerJobParameter parameter="blah"/>
 …  some more Auger parameters  …
 <Variable name="internal_variable"/>
 …  some more variables  …
 <List name="list_variable">l1 l2 l3 … </List>
 <ForEach list="list_variable">
  <Job>
   <Command><![CDATA[
    … shell script that executes some stuff …
   ]]></Command>
   <Output src="out.slcio" dest="${out_dir}/${ebeam}/${out_file}_${num}.slcio" />
   <Stderr dest="${log_dir}/${ebeam}/${out_file}_${num}.err" />
   <Stdout dest="${log_dir}/${ebeam}/${out_file}_${num}.out" />
  </Job>
 </ForEach>
</Request>

This xml gets executed on the batch farm, but to submit it there we use a shell script like this: 

/u/group/hps/production/mc/EngRun2015Scripts/runjob.sh
hps@ifarm1401> cat runjob.sh 
#!/bin/tcsh
if ( $#argv != 4 ) then
 echo "$0 <filename> <ebeam> <firstnum> <lastnum>"
 echo "starts 1 job per num, 1 input file per job"
 exit
endif
set nums=`seq $3 $4`
cp $1 temp.xml
set apmass=`/u/group/hps/production/mc/run_params/apmass.csh $2`
sed -i '/List .*\"num\"/{s/>.*</>'"$nums"'</}' temp.xml
sed -i '/List .*\"apmass\"/{s/>.*</>'"$apmass"'</}' temp.xml
sed -i '/Variable.*\"ebeam\"/{s/value=\".*\"/value=\"'"$2"'\"/}' temp.xml
#cat temp.xml
jsub -xml temp.xml

 

The last line, with “jsub”, is the one that actually submits to the queue.
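To see what the sed edits in runjob.sh actually do, here is a miniature, self-contained demonstration against a two-line stand-in for the template (the element names match the real template, but the file contents here are invented for illustration):

```shell
#!/bin/bash
cd "$(mktemp -d)"

# Miniature stand-in for the real xml template (contents invented for illustration)
cat > temp.xml <<'EOF'
<List name="num">0</List>
<Variable name="ebeam" value="unset"/>
EOF

# runjob.sh builds a space-separated number list and splices it into the <List> body...
nums=$(seq 1 3 | xargs)   # "1 2 3"
sed -i '/List .*"num"/{s/>.*</>'"$nums"'</}' temp.xml

# ...and rewrites the value attribute of the ebeam <Variable>
sed -i '/Variable.*"ebeam"/{s/value=".*"/value="1pt05"/}' temp.xml

cat temp.xml
# <List name="num">1 2 3</List>
# <Variable name="ebeam" value="1pt05"/>
```

The same two substitution patterns are what the script applies to the full Auger template before calling jsub.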

IMPORTANT: Always submit the xml script with the proper type of runjob! (The xml header will remind you of this as well.) E.g. beam-tri_100.xml is submitted with ./runjob_100.sh, and beam-tri_100to1.xml with ./runjob_100to1.sh.

Alternatively, there are 'smart' versions of each runjob script that are meant to handle large numbers of files. These 'smartrunjobs' first check that every input file to be used exists, and also that none of the output files exist. If everything is correct, the script either submits the jobs or gives further instructions for how to do so.

'smartrunjob_v1' takes in the beam energy in the same place as the standard runjobs.

Run with: python smartrunjob_v1.py <job template> <energy> <firstnum> <lastnum>   

'smartrunjob_v2.py' takes in a variable template (e.g. EngRun2015params.xml) in place of the beam energy, which sets any desired run parameters before submitting jobs. (Check this template first!)

Run with: python smartrunjob_v2.py <job template> <variable template> <firstnum> <lastnum>
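The pre-flight checks the smart scripts perform can be sketched as follows. This is a hypothetical shell reimplementation for illustration only; the real smartrunjob scripts are Python, and their file-naming conventions may differ:

```shell
#!/bin/bash
# Hypothetical sketch of the smartrunjob pre-flight checks:
# every input file must exist, and no output file may exist, before anything is submitted.
check_range() {
  local in_prefix=$1 out_prefix=$2 first=$3 last=$4 num
  for num in $(seq "$first" "$last"); do
    [ -f "${in_prefix}${num}.stdhep" ] || { echo "missing input ${num}"; return 1; }
    [ -e "${out_prefix}${num}.slcio" ] && { echo "output ${num} already exists"; return 1; }
  done
  echo "ok to submit"
}

cd "$(mktemp -d)"
touch beam_1.stdhep beam_2.stdhep
check_range beam_ out_ 1 2    # prints "ok to submit"
```

If either check fails, nothing is submitted and the script reports which file caused the problem.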


More details on submitting jobs with 'runjob.sh' can be found below.

A few other helper scripts that are used in the auger xml are in:

/u/group/hps/production/mc/run_params

They are: 

  • ebeam.csh:  given the beam energy string (e.g. “1pt05”) returns the beam energy in MeV
  • dz.csh:  given the beam energy string, returns the target thickness
  • ne.csh:  given the beam energy string, returns the number of electrons per 2ns bunch
  • apmass.csh: given the beam energy string, returns a list of A’ masses to simulate
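As an illustration of the beam-energy string convention these scripts consume, “1pt05” reads as 1.05 GeV. A hypothetical converter in that spirit (the real ebeam.csh may compute and format its output differently) looks like:

```shell
#!/bin/bash
# Hypothetical converter: beam-energy string -> MeV.
# Assumes the "XptY" convention means X.Y GeV; the real ebeam.csh may differ.
ebeam_mev() {
  local gev="${1/pt/.}"                        # "1pt05" -> "1.05"
  awk -v e="$gev" 'BEGIN { printf "%.0f\n", e * 1000 }'
}

ebeam_mev 1pt05   # 1050
ebeam_mev 2pt3    # 2300
```

The other helpers (dz.csh, ne.csh, apmass.csh) follow the same pattern of mapping the energy string to run parameters.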

Event Generation: 

 

As mentioned, the event generation is performed via MadGraph, EGS5, or GEANT4 depending on the type.  Below, the different types of particles produced are listed, with some instructions on each.

 

  • beam electrons:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/stdhep/beam.xml
    •  run with ./runjob.sh stdhep/beam.xml 1pt05 <firstjob> <lastjob>
    • the executable that does the event generation is /u/group/hps/production/mc/egs5/beam_v3.exe
    • output is an .stdhep file with each event being 1 scattered electron (I think…1 event _may_ be 1 bunch) 
  • beam hadrons:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/stdhep/hadrons.xml
    •  run with ./runjob.sh stdhep/hadrons.xml 1pt05 <firstjob> <lastjob>
    • the executable that does the event generation is a GEANT4 release that gets copied locally
    • output is an .stdhep file with each event being the hadrons from one bunch
  • unbiased tridents == “tri”:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/tri.xml
    •  run with ./runjob.sh lhe/tri.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/trigg/run_card_<ebeam>.dat
    • output is an lhe file with each event being 1 trident
    • these files are small, so they are bundled 100-to-1 with:
      • ./runjob_100to1.sh lhe/tri_merge_100to1.xml 1pt1 1 100
  • enhanced tridents == “tritrig”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/tritrig.xml
    •  run with ./runjob.sh lhe/tritrig.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/tritrig/run_card_<ebeam>.dat
      • the cuts here are E(e+e-)>0.8 EBeam and m(e+e-)>10MeV
    • output is an lhe file with each event being 1 trident
  • signal events ==  “ap-truth”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/ap-truth.xml
    •  run with ./runjob.sh lhe/ap-truth.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/ap/run_card_<ebeam>.dat
    • also converts the lhe to stdhep while adding displaced vertex (though now ct is hardcoded to 0 in the .xml)
    • output is an .stdhep file with each event being 1 signal event
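Putting the event-generation commands above together, a small wrapper can queue the 1pt05 samples in sequence. The sketch below is a dry run that only prints the submissions (remove the echo to actually submit, from the production scripts directory; the 100-to-1 merge step for tri is omitted here):

```shell
#!/bin/bash
# Dry run: print (not execute) the runjob.sh command for each event-generation template.
gen_submissions() {
  local ebeam=$1 first=$2 last=$3 tmpl
  for tmpl in stdhep/beam.xml stdhep/hadrons.xml lhe/tri.xml lhe/tritrig.xml lhe/ap-truth.xml; do
    echo ./runjob.sh "$tmpl" "$ebeam" "$first" "$last"
  done
}

gen_submissions 1pt05 1 10
```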

 

Detector Simulation

 

The detector simulation is performed by slic, a front-end for GEANT4 that incorporates the specific HPS geometry and writes the GEANT4 output into LCIO collections, which are then used by the reconstruction in hps-java.  The batch scripts do more than just run slic, though; they may also mix components first, such as with the beam overlay. Below is what we do for each sample type:

 

  • beam simulation == “beam-tri”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/slic/beam-tri_100.xml
    •  run with ./runjob_100.sh slic/beam-tri_100.xml 1pt05 <firstnum> <lastnum>
    •  this submits 100 jobs for each num between <firstnum> and <lastnum>, combining beam, tri, and hadron events into 2ns beam “bunches”, 1 per event
      •  e.g. <firstnum>=11, <lastnum>=20 submits 100 + 100*(20-11) = 1000 jobs; 100 output files use each input #
      • inputs: 1 .stdhep file each of beam, hadrons, and tri (bundled 100to1) for each num
      • outputs: 100 .slcio files for each num, each sampled differently from the mixed input files
  • enhanced tridents == “tritrig”
    •  auger xml: /u/group/hps/production/mc/EngRun2015Scripts/slic/tritrig.xml
    •  run with ./runjob.sh slic/tritrig.xml 1pt05 <firstnum> <lastnum>
    •  runs slic over the tridents, 1 trident-per-event
  • signal events == “ap-truth”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/slic/aptruth.xml
    •  run with ./runjob.sh slic/aptruth.xml 1pt05 <firstnum> <lastnum>
    •  runs slic over signal events, 1 signal event-per-event
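The fan-out arithmetic for the _100-style jobs can be written down explicitly; the sketch below reproduces the example above (100 jobs per num), together with the inverse count for the 100-to-1 merge scripts used elsewhere in the chain:

```shell
#!/bin/bash
# Number of Auger jobs submitted by a runjob_100-style script:
# 100 jobs for every num in [first, last].
njobs_100() { echo $(( 100 * ($2 - $1 + 1) )); }

# Number submitted by a runjob_100to1-style script: one merge job per num.
njobs_100to1() { echo $(( $2 - $1 + 1 )); }

njobs_100 11 20      # 1000
njobs_100to1 11 20   # 10
```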


Readout Simulation


Simulation of the SVT and ECal digital readout is performed using the hps-java framework.


  • beam background readout == "beam-tri"
    • auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/readout/beam-tri_<trigger>_100to1.xml
      • <trigger> can be 'singles' (event occurs for each ECal cluster passing the cuts -> ~100k events) or 'pairs' (2 clusters on opposite sides pass the cuts within a time window -> ~1k events) 
    •  run with ./runjob_100to1.sh <auger xml> 1pt05 <firstnum> <lastnum>
    •  this submits 1 job for each num between <firstnum> and <lastnum>, combining the simulated readout using the 100 files from the 'Detector Simulation' step into 1 output file
      • e.g. <firstnum>=11, <lastnum>=20 submits 1 + (20-11) = 10 jobs, which makes output files 11 through 20
      • inputs: 100 sampled .slcio files for each num between <firstnum> and <lastnum>, e.g. num=12 takes in files 1100-1200 as input
      • output: 1 readout .slcio file: the combined readout of the inputs
         

     
  • enhanced tridents with beam background == “tritrig-beam-tri”
    •  COMING SOON