- Determine *when* the runs were taken. You'll need the day number of the year. One method is (a worked example appears after this list):

      python
      >>> from datetime import *
      >>> datetime(<yyyy>, <mm>, <dd>).toordinal() - datetime(<yyyy>, 1, 1).toordinal() + 1

- Copy the data from `/gnfs/data/DVD/RawPackets/2006<day of year>/` to a local directory, say `~/sci`. (It is probably best to use the glast account for this to ensure that you are working with the correct environment.) The date and time found in the eLog runs section are the same as those used in the `.../RawArchive/*` directories. Only the `*_SCI.pkt` files that span the run of interest are needed (a copy sketch appears after this list). Then create the output directory:

      mkdir ~/sci/lsf
- Using the `glast` account on a `lat-licos*` machine or equivalent (i.e., a machine that has LICOS_Scripts installed), run:

      python $ONLINE_ROOT/LICOS_Scripts/analysis/LsfWriter.py --lsd_dir=~/sci --lsf_dir=~/sci/lsf

  Determine which of the output files is the run you want. The file name format is:

      <ApId>-<GroundId>-<Time>-<nFirst>-<nEnd>.lsf

  `GroundId` is the run number, without the 077000000, in hex. Verify that `nFirst` is 0000. (A file-name parsing sketch appears after this list.)
- Log in to lat-dmz01 and enter the following to connect to the MySQL database (enter the password at the prompt):

      mysql --user elogbook elogbook --password
- Enter the following command to reset the analysis state, replacing 999999999 with the acquire run id (a scripted version is sketched after this list):

      update LICOS_activities set JobCompleteList = '[None]', AnalysisState_FK = 7 where RunID_FK = '999999999';
- Copy the `lsf` file(s) to `/nfs/data/lsf` (a copy sketch appears after this list). If the corresponding DAQ run had multiple steps, there will be multiple `lsf` files. AnalysisEngine will see them and should be able to reprocess the run.
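
A worked example of the day-of-year calculation from the first step, using a hypothetical date (15 March 2006); the `tm_yday` cross-check and the zero-padded directory name are illustrative assumptions.

```python
from datetime import datetime

# Hypothetical run date: 15 March 2006.
d = datetime(2006, 3, 15)

# Method from the first step: ordinal difference from 1 January of the same year.
day_of_year = d.toordinal() - datetime(2006, 1, 1).toordinal() + 1
print(day_of_year)                # 74

# Cross-check with the struct_time field, which gives the same number.
print(d.timetuple().tm_yday)      # 74

# Raw-packet directory for that day (3-digit zero padding is an assumption).
print("/gnfs/data/DVD/RawPackets/2006%03d/" % day_of_year)
```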
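A minimal sketch of the copy step, written in Python rather than plain shell for illustration; the paths follow the list above, but deciding which `*_SCI.pkt` files actually span the run of interest is still up to the operator.

```python
import glob
import os
import shutil

# Source directory for the day computed above; 3-digit zero padding is an assumption.
day_dir = "/gnfs/data/DVD/RawPackets/2006074/"
sci_dir = os.path.expanduser("~/sci")
lsf_dir = os.path.join(sci_dir, "lsf")

for d in (sci_dir, lsf_dir):          # ~/sci and ~/sci/lsf, as in the list above
    if not os.path.isdir(d):
        os.makedirs(d)

# Copy only the science-packet files; narrowing the selection to the files that
# actually span the run of interest (by timestamp) is left to the operator.
for pkt in sorted(glob.glob(os.path.join(day_dir, "*_SCI.pkt"))):
    shutil.copy(pkt, sci_dir)
```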
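A sketch of how the `LsfWriter.py` output file names could be checked against the `<ApId>-<GroundId>-<Time>-<nFirst>-<nEnd>.lsf` format; the hypothetical `run_id` and the idea of adding 77000000 back to the decoded `GroundId` are assumptions to be verified against the eLog entry.

```python
import glob
import os

run_id = 77000123      # hypothetical acquire run id being reprocessed

for path in sorted(glob.glob(os.path.expanduser("~/sci/lsf/*.lsf"))):
    name = os.path.basename(path)[:-len(".lsf")]
    fields = name.split("-")
    if len(fields) != 5:
        continue       # not the expected <ApId>-<GroundId>-<Time>-<nFirst>-<nEnd> pattern
    ap_id, ground_id, time_field, n_first, n_end = fields

    # GroundId is the run number without the 077000000, in hex (adding the
    # offset back to compare against the full run id is an assumption).
    if 77000000 + int(ground_id, 16) != run_id:
        continue

    print("candidate file for run %d: %s" % (run_id, os.path.basename(path)))
    if n_first != "0000":
        print("warning: nFirst is %s, expected 0000" % n_first)
```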
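The two database steps could also be scripted. This sketch assumes the `MySQLdb` module is available and that it is run on lat-dmz01 itself; it uses the same elogbook credentials and the same UPDATE statement as above, with the run id passed as a parameter.

```python
import getpass

import MySQLdb        # the MySQL-python module; its availability on lat-dmz01 is an assumption

run_id = "999999999"  # replace with the acquire run id

# Run this on lat-dmz01 itself, mirroring the manual mysql step above.
conn = MySQLdb.connect(host="localhost", user="elogbook",
                       passwd=getpass.getpass("elogbook password: "),
                       db="elogbook")
try:
    cur = conn.cursor()
    # Same UPDATE as in the list above.
    cur.execute(
        "update LICOS_activities "
        "set JobCompleteList = '[None]', AnalysisState_FK = 7 "
        "where RunID_FK = %s",
        (run_id,))
    conn.commit()
    print("rows updated: %d" % cur.rowcount)
finally:
    conn.close()
```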
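For the final step, a short sketch that copies every `lsf` file belonging to the chosen run to `/nfs/data/lsf`; matching on the `GroundId` field of the file name is an assumption about how the per-step files of a multi-step run are grouped.

```python
import glob
import os
import shutil

ground_id = "4a3f"    # hypothetical GroundId (hex field from the lsf file name)

# A DAQ run with multiple steps produces multiple lsf files; the GroundId field
# is common to all of them, so match on it when copying to /nfs/data/lsf.
for lsf in sorted(glob.glob(os.path.expanduser("~/sci/lsf/*-%s-*.lsf" % ground_id))):
    shutil.copy(lsf, "/nfs/data/lsf")
    print("copied %s" % os.path.basename(lsf))
```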