- Determine *when* the runs were taken. You'll need the day number of the year. One method, in a Python interpreter:

  ```
  >>> from datetime import *
  >>> datetime(<yyyy>, <mm>, <dd>).toordinal() - datetime(<yyyy>, 1, 1).toordinal() + 1
  ```
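  For example (the date here is hypothetical; `tm_yday` is the standard library's built-in day-of-year field and gives the same answer):

  ```
  >>> from datetime import datetime
  >>> datetime(2006, 3, 16).toordinal() - datetime(2006, 1, 1).toordinal() + 1
  75
  >>> datetime(2006, 3, 16).timetuple().tm_yday   # equivalent shortcut
  75
  ```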
- Copy the data from `/gnfs/data/DVD/RawPackets/2006<DDD>/` to a local directory, say `~/sci` (where `DDD` is the day of year from above). The date and time found in the eLog runs section are the same as those used in the `.../RawArchive/*` directories. Only the `*_SCI.pkt` files that span the run of interest are needed. Then create a directory for the `lsf` output:

  ```
  mkdir ~/sci/lsf
  ```
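  A minimal sketch of this copy step, assuming day 075 of 2006 (hypothetical) and that all of the day's `*_SCI.pkt` files are wanted; trimming to just the files that span the run can be done using the eLog date/time:

  ```
  # a sketch: copy the day's *_SCI.pkt files to ~/sci and create ~/sci/lsf;
  # the day number 075 is a hypothetical stand-in for <DDD>
  import glob
  import os
  import shutil

  src = "/gnfs/data/DVD/RawPackets/2006075"
  dst = os.path.expanduser("~/sci")
  if not os.path.isdir(os.path.join(dst, "lsf")):
      os.makedirs(os.path.join(dst, "lsf"))    # same effect as mkdir ~/sci/lsf
  for pkt in glob.glob(os.path.join(src, "*_SCI.pkt")):
      shutil.copy(pkt, dst)
  ```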
- Using the `glast` account on a `lat-licos*` machine or equivalent (i.e., a machine that has `LICOS_Scripts` installed), run:

  ```
  $ONLINE_ROOT/LICOS_Scripts/analysis/LsfWriter.py --lsd_dir=~/sci --lsf_dir=~/sci/lsf
  ```
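  If this step needs repeating, the same invocation can be scripted; a minimal sketch, assuming only the command line shown above (note that the `~` paths must be expanded by hand, since no shell is involved):

  ```
  # a sketch wrapping the LsfWriter.py call above; assumes $ONLINE_ROOT is
  # set in the environment and that the script is directly executable
  import os
  import subprocess

  lsf_writer = os.path.expandvars("$ONLINE_ROOT/LICOS_Scripts/analysis/LsfWriter.py")
  subprocess.check_call([lsf_writer,
                         "--lsd_dir=" + os.path.expanduser("~/sci"),
                         "--lsf_dir=" + os.path.expanduser("~/sci/lsf")])
  ```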
- Determine which of the output files is the run you want. The file name format is:

  ```
  <ApId>-<GroundId>-<Time>-<nFirst>-<nEnd>.lsf
  ```

  `GroundId` is the run number, without the leading 077000000, in hex. Verify that `nFirst` is 0000.
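  For example, assuming "without the 077000000" means subtracting it (the run id here is hypothetical):

  ```
  >>> run_id = 77001234            # hypothetical acquire run id 077001234
  >>> "%x" % (run_id - 77000000)   # GroundId as it appears in the file name
  '4d2'
  ```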
- Login to lat-dmz01 and enter the following to connect to the MySQL database (enter the password at the prompt):

  ```
  mysql --user elogbook elogbook --password
  ```
- Enter the following command to reset the analysis state (replace 077RRRRRR with the acquire run id):

  ```
  update LICOS_activities set JobCompleteList = '[None]', AnalysisState_FK = 7 where RunID_FK = '077RRRRRR';
  ```
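  These last two steps can also be scripted; a sketch, assuming the MySQLdb module is available on lat-dmz01 (both that availability and the run id below are assumptions):

  ```
  # a sketch of the connect-and-reset sequence above; MySQLdb availability
  # and the run id 077001234 are assumptions, the table and columns are
  # taken from the update statement above
  import getpass
  import MySQLdb

  conn = MySQLdb.connect(user="elogbook", db="elogbook",
                         passwd=getpass.getpass("Password: "))
  cur = conn.cursor()
  cur.execute("update LICOS_activities "
              "set JobCompleteList = '[None]', AnalysisState_FK = 7 "
              "where RunID_FK = %s", ("077001234",))
  conn.commit()
  conn.close()
  ```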
- Copy the `lsf` file(s) to `/nfs/data/lsf/.`. If the corresponding DAQ run had multiple steps, there will be multiple `lsf` files. AnalysisEngine will see them and should be able to reprocess the run.