- Determine *when* the runs were taken. You'll need the day number of the year. One method, in Python:

  ```python
  >>> from datetime import *
  >>> datetime(<yyyy>, <mm>, <dd>).toordinal() - datetime(<yyyy>, 1, 1).toordinal() + 1
  ```

- Copy the data from `/gnfs/data/DVD/RawPackets/2006<DDD>/` to a local directory, say `~/sci` (where `DDD` is the day of year from above). The date and time found in the eLog runs section are the same as used in the `.../RawArchive/*` directories. Only the `*_SCI.pkt` files that span the run of interest are needed. Then:

  ```shell
  mkdir ~/sci/lsf
  ```
- Using the `glast` account on a `lat-licos*` machine or equivalent (i.e., a machine that has `LICOS_Scripts` installed), run:

  ```shell
  python $ONLINE_ROOT/LICOS_Scripts/analysis/LsfWriter.py --lsd_dir=~/sci --lsf_dir=~/sci/lsf
  ```

  Determine which of the output files is the run you want. The file name format is:

  ```
  <ApId>-<GroundId>-<Time>-<nFirst>-<nEnd>.lsf
  ```

  `GroundId` is the run number, without the 077000000 prefix, in hex. Verify that `nFirst` is 0000.

- Login to lat-dmz01 and enter the following to connect to the MySQL database (enter the password at the prompt):

  ```shell
  mysql --user elogbook elogbook --password
  ```

- Enter the following command to reset the analysis state (replace 077RRRRRR with the acquire run ID):

  ```sql
  update LICOS_activities set JobCompleteList = '[None]', AnalysisState_FK = 7 where RunID_FK = '077RRRRRR';
  ```
- Copy the `lsf` file(s) to `/nfs/data/lsf/.`. If the corresponding DAQ run had multiple steps, there will be multiple `lsf` files. `AnalysisEngine` will see them and should be able to reprocess the run.
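The day-of-year arithmetic in the first step and the GroundId-to-hex lookup for the `.lsf` file name can be sketched in Python. The helper names below are mine, not part of `LICOS_Scripts`, and the example run ID is purely illustrative:

```python
from datetime import datetime

def day_of_year(yyyy: int, mm: int, dd: int) -> int:
    """Day number of the year, as computed in the first step above."""
    return datetime(yyyy, mm, dd).toordinal() - datetime(yyyy, 1, 1).toordinal() + 1

def ground_id_hex(run_id: int) -> str:
    """GroundId as it appears in the .lsf file name: the run number
    without the 077000000 prefix, written in hex."""
    return format(run_id - 77000000, "x")

# e.g. a run taken on 2006-09-15 would live under .../RawPackets/2006258/
print(day_of_year(2006, 9, 15))   # -> 258
print(ground_id_hex(77015459))    # -> 3c63
```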
4 Comments
Philip Hart
Selim notes: to copy the database record from lat-dmz01 to lat-hobbit5, here is what is needed.

On lat-dmz01, connect to MySQL as the user `elogbookadmin`:

```shell
mysql --user elogbookadmin elogbook --password
```

```sql
mysql> select * from LICOS_activities where RunID_FK = 77010481 into outfile '/tmp/077010481.txt';
```

`scp` the file /tmp/077010481.txt to glastlnx06. Login to lat-hobbit5, connect to MySQL the same way as above, and enter the following SQL command:

```sql
load data local infile '077010481.txt' into table LICOS_activities;
```

If needed, update the SQL record as stated in the Confluence page.
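The two SQL statements in this record-copy recipe are easy to mistype by hand; a minimal sketch that just assembles them for a given run (the function names and the zero-padded dump file name are my assumptions):

```python
def export_stmt(run_id: int) -> str:
    # Statement to run on the source host (lat-dmz01): dump one record to a file.
    return (f"select * from LICOS_activities where RunID_FK = {run_id} "
            f"into outfile '/tmp/{run_id:09d}.txt';")

def import_stmt(run_id: int) -> str:
    # Statement to run on the destination host (lat-hobbit5) after copying the file.
    return f"load data local infile '{run_id:09d}.txt' into table LICOS_activities;"

print(export_stmt(77010481))
print(import_stmt(77010481))
```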
Philip Hart
Latest method of resurrecting run data, via Bryson: for, e.g., run 15459, go to the I&T "rawData" directory /nfs/farm/g/glast/u40/Integration/rawData/077015459, find the RetDef-padded.xml file, and execute

```shell
getEvents.exe RetDef-padded.xml 077015459.evt
```

in a NIGHTLY_DEVEL environment.
Philip Hart
If the MCR is unavailable, the LICOS_activities table can be accessed at SLAC (give or take some format issues) by:

```shell
mysql --user elogbookadmin elogbook --password -h glastdb
```
Philip Hart
For I&T/MOC storage issues:

```shell
PktDump.py --run 77015783 -p 956 -p 957 -p 958 -o 15783.pkt -a /nfs/farm/g/glast/u42/ISOC/Archive/level0/
```