
Overview: 

There are 4 steps to MC production:  event generation, detector simulation, readout simulation, and reconstruction. The tools used are: 

  • Event Generation:  
    • MadGraph:  tridents & signal events 
    • EGS5:  beam-target interactions (includes Mollers), wide-angle bremsstrahlung 
    • GEANT4:  hadrons from the beam-target interaction
  • Detector Simulation: 
    • slic (GEANT4 based)
  • Readout Simulation: 
    • hps-java (on slic output)
  • Reconstruction: 
    • hps-java (on readout output…gives a recon .slcio file)
    • dst-maker & lcio api (on recon output…gives a root file)

Currently, each step is performed in a separate batch job, though it may be possible to combine some to save wall & cpu time and tape storage.  Something to keep in mind...

We make the following event samples:

  • "beam-tri":  simply simulates the beam going through the target; the “tri” refers to trident events that are inserted at the expected rate.  This is intended to be an unbiased representation of our data
  • “tritrig-beam-tri”:  the “tritrig” refers to a sample of tridents that are preselected (at the MadGraph level) to be more likely to be in the detector acceptance and pass the trigger cuts, so that we can efficiently get a large sample.  These then have “beam-tri” overlaid in order to simulate what trident events look like in the detector.  
  • “ap” and “ap-beam-tri”:  the “ap” stands for “A prime” and is our signal MC…they can be generated at a given mass and ctau and we produce them with and w/o beam overlay (mostly with).  
  • we’ll also, in the near future, want to simulate dedicated Moller and wide-angle Brem samples 

Below, I go through each generation step for each sample. 

Batch Job Submission:

 

There is a basic example on the confluence here:

https://confluence.slac.stanford.edu/display/hpsg/Example+of+using+the+JLAB+Auger+system

For official production, we run from here:

/u/group/hps/production/mc/EngRun2015Scripts

and you should be logged in as user=hps. You need to be put on the list to log in as this user, but once you are you don’t need a password; just log into JLAB unix as yourself and then ssh hps@ifarm65 (or any other machine). 

There are two parts to the job submission:  the xml template that gets parsed and runs in the batch machine and the script (currently /bin/tcsh) that manipulates the xml template pre-submission and executes the “jsub” command to submit to the batch queue. An example of the xml template is at:

/u/group/hps/production/mc/EngRun2015Scripts/stdhep/beam.xml

The basic format of this xml file is: 

 

<Request>
 <AugerJobParameter parameter="blah"/>
 …  some more Auger parameters  …
 <Variable name="internal_variable"/>
 …  some more variables  …
 <List name="list_variable">l1 l2 l3 … </List>
 <ForEach list="list_variable">
  <Job>
   <Command><![CDATA[
    …shell script that executes some stuff ...
   ]]></Command>
   <Output src="out.slcio" dest="${out_dir}/${ebeam}/${out_file}_${num}.slcio" />
   <Stderr dest="${log_dir}/${ebeam}/${out_file}_${num}.err" />
   <Stdout dest="${log_dir}/${ebeam}/${out_file}_${num}.out" />
  </Job>
 </ForEach>
</Request>

This xml gets executed on the batch farm; to submit it there we use a shell script like this: 

/u/group/hps/production/mc/EngRun2015Scripts/runjob.sh
hps@ifarm1401> cat runjob.sh 
#!/bin/tcsh
if ( $#argv != 4 ) then
 echo "$0 <filename> <ebeam> <firstnum> <lastnum>"
 echo "starts 1 job per num, 1 input file per job"
 exit
endif
set nums=`seq $3 $4`
cp $1 temp.xml
set apmass=`/u/group/hps/production/mc/run_params/apmass.csh $2`
sed -i '/List .*\"num\"/{s/>.*</>'"$nums"'</}' temp.xml
sed -i '/List .*\"apmass\"/{s/>.*</>'"$apmass"'</}' temp.xml
sed -i '/Variable.*\"ebeam\"/{s/value=\".*\"/value=\"'"$2"'\"/}' temp.xml
#cat temp.xml
jsub -xml temp.xml

 

The last line, with “jsub”, is the one that actually does the submission to the queue.  
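The sed substitutions in runjob.sh can be tried out standalone. Below is a minimal sketch in POSIX sh (the real script is tcsh) using a stand-in temp.xml; the file contents here are illustrative, not the real beam.xml:

```shell
# Minimal stand-in for the xml template; the real beam.xml has much more in it.
cat > temp.xml <<'EOF'
<Request>
 <Variable name="ebeam" value="PLACEHOLDER"/>
 <List name="num">0</List>
</Request>
EOF

# tcsh's `set nums=`seq 1 3`` yields a space-separated word list;
# in POSIX sh we join seq's output lines with xargs to get the same thing.
nums=$(seq 1 3 | xargs)
ebeam="1pt05"

# The same substitutions runjob.sh applies before calling jsub:
sed -i '/List .*"num"/{s/>.*</>'"$nums"'</}' temp.xml
sed -i '/Variable.*"ebeam"/{s/value=".*"/value="'"$ebeam"'"/}' temp.xml

cat temp.xml
```

After this runs, the num list element contains "1 2 3" and the ebeam variable's value is "1pt05" — the state of temp.xml that jsub receives.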

A few other helper scripts that are used in the auger xml are in:  

/u/group/hps/production/mc/run_params

They are: 

  • ebeam.csh:  given the beam energy string (e.g. “1pt05”) returns the beam energy in MeV
  • dz.csh:  given the beam energy string, returns the target thickness
  • ne.csh:  given the beam energy string, returns the number of electrons per 2ns bunch
  • apmass.csh: given the beam energy string, returns a list of A’ masses to simulate
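These are essentially lookup tables keyed on the beam-energy string. As a purely hypothetical sketch (the real ebeam.csh may be implemented differently; the function name and mapping below are illustrative, with “1pt05” read as 1.05 GeV = 1050 MeV):

```shell
# Hypothetical POSIX-sh version of the ebeam.csh lookup -- not the real script.
ebeam_mev() {
  case "$1" in
    1pt05) echo 1050 ;;                                  # 1.05 GeV beam
    *) echo "unknown ebeam string: $1" >&2; return 1 ;;
  esac
}

ebeam_mev 1pt05   # prints 1050
```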

Event Generation: 

 

As mentioned, the event generation is performed via MadGraph, EGS5, or GEANT4 depending on the sample type.  The different types of events produced, and some instructions for each, are listed below.

 

  • beam electrons:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/stdhep/beam.xml
    •  run with ./runjob.sh stdhep/beam.xml 1pt05 <firstjob> <lastjob>
    • the executable that does the event generation is /u/group/hps/production/mc/egs5/beam_v3.exe
    • output is a stdhep file with each event being 1 scattered electron (I think…1 event _may_ be 1 bunch) 
  • beam hadrons:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/stdhep/hadrons.xml
    •  run with ./runjob.sh stdhep/hadrons.xml 1pt05 <firstjob> <lastjob>
    • the executable that does the event generation is a GEANT4 release that gets copied locally
    • output is a stdhep file with each event being hadrons from bunch
  • unbiased tridents == “tri”:  
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/tri.xml
    •  run with ./runjob.sh lhe/tri.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/trigg/run_card_<ebeam>.dat
    • output is an lhe file with each event being 1 trident
    • these files are pretty small, so after these are completed you should tar them up 100-to-1 with:
      • ./runjob_100to1.sh lhe/tri_merge_100to1.xml 1pt1 1 100
  • enhanced tridents == “tritrig”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/tritrig.xml
    •  run with ./runjob.sh lhe/tritrig.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/tritrig/run_card_<ebeam>.dat
      • the cuts here are E(e+e-) > 0.8*Ebeam and m(e+e-) > 10 MeV
    • output is an lhe file with each event being 1 trident
  • signal events ==  “ap-truth”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/lhe/ap-truth.xml
    •  run with ./runjob.sh lhe/ap-truth.xml 1pt05 <firstjob> <lastjob>
    •  this runs MadGraph using the  /u/group/hps/production/mc/ap/run_card_<ebeam>.dat
    • also converts the lhe to stdhep while adding displaced vertex (though now ct is hardcoded to 0 in the .xml)
    • output is a stdhep file with each event being 1 signal (A’) event
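To make the tritrig preselection above concrete, here is a small standalone illustration of the cuts themselves (this is not how MadGraph applies them; the (E_sum, m_ee) values are made up):

```shell
# Illustrative only: keep a pair if E(e+e-) > 0.8*Ebeam and m(e+e-) > 10 MeV.
# Columns are E_sum and m_ee in GeV, for three made-up events.
ebeam=1.05
printf '0.90 0.020\n0.70 0.050\n0.95 0.005\n' |
  awk -v eb="$ebeam" '$1 > 0.8*eb && $2 > 0.010 { print "pass:", $1, $2 }'
```

Only the first event passes both cuts (0.90 > 0.84 and 0.020 > 0.010); the second fails the energy cut and the third fails the mass cut.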

 

Detector Simulation

 

The detector simulation is performed by slic, a front-end for GEANT4 that incorporates the specific geometry HPS uses and writes the GEANT4 output into LCIO collections, which are then used by the reconstruction in hps-java.  The batch scripts do more than just run slic, though, particularly for making samples with beam overlay.  Below is what we do for each sample type:

 

  • beam simulation == “beam-tri”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/slic/beam-tri_100.xml
    •  run with ./runjob_100.sh slic/beam-tri_100.xml 1pt05 <firstjob> <lastjob>
    • this submits 100 jobs, combining beam, tri, and hadron events into beam “bunches”, 1 bunch per event
  • enhanced tridents == “tritrig”
    • auger xml: /u/group/hps/production/mc/EngRun2015Scripts/slic/tritrig.xml
    •  run with ./runjob.sh slic/tritrig.xml 1pt05 <firstjob> <lastjob>
    • runs slic over the tridents, 1 trident-per-event
  • signal events == “ap-truth”
    •  auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/slic/aptruth.xml
    •  run with ./runjob.sh slic/aptruth.xml 1pt05 <firstjob> <lastjob>
    • runs slic over signal events, 1 signal event-per-event


Readout Simulation


Simulation of the SVT and ECal digital readout is performed using the hps-java framework.


  • beam background readout == "beam-tri"
    • auger xml:  /u/group/hps/production/mc/EngRun2015Scripts/readout/beam-tri_<trigger>_100to1.xml
      • <trigger> can be 'singles' (event occurs for each ECal cluster passing the cuts -> more events) or 'pairs' (2 clusters on opposite sides pass the cuts within a time window -> fewer events) 
    •  run with ./runjob_100to1.sh readout/beam-tri_pairs1_1_5mm_100to1.xml 1pt05 <firstjob> <lastjob>
    •  this submits 1 job, combining the simulated readout of 100 files from the previous stage into 1 output file

  • enhanced tridents with beam background== “tritrig-beam-tri”
    •  COMING SOON