1. Determine *when* the runs were taken.  You'll need the day number of the year.  One method is:
    >>> from datetime import datetime
    >>> datetime(<yyyy>, <mm>, <dd>).toordinal() - datetime(<yyyy>, 1, 1).toordinal() + 1
    Copy the data from /gnfs/data/DVD/RawPackets/2006<DDD>/ to a local directory, say ~/sci (where DDD is the day of year computed above).  The date and time shown in the eLog runs section match those used in the .../RawArchive/* directories.  Only the *_SCI.pkt files that span the run of interest are needed.
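The day-of-year arithmetic above can also be written with `timetuple().tm_yday`, which gives the same 1-based day number (a minimal sketch; the example date is arbitrary):

```python
from datetime import datetime

def day_of_year(year, month, day):
    # 1-based day number within the year, equivalent to the
    # toordinal() subtraction shown above.
    return datetime(year, month, day).timetuple().tm_yday

print(day_of_year(2006, 2, 14))  # -> 45 (31 days of January + 14)
```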
  2. mkdir ~/sci/lsf
  3. Using the glast account on a lat-licos* or equivalent (i.e., a machine that has LICOS_Scripts installed):
    python $ONLINE_ROOT/LICOS_Scripts/analysis/LsfWriter.py --lsd_dir=~/sci --lsf_dir=~/sci/lsf
    Determine which of the output files is the run you want.  File format is:
  4. <ApId>-<GroundId>-<Time>-<nFirst>-<nEnd>.lsf
    GroundId is the run number, in hex, without the leading 077000000.  Verify that nFirst is 0000.
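The filename fields can be pulled apart programmatically; a sketch (the helper name and the example filename are illustrative, not part of LICOS_Scripts):

```python
def parse_lsf_name(filename):
    # Split <ApId>-<GroundId>-<Time>-<nFirst>-<nEnd>.lsf and recover the
    # full run id: GroundId is the run number in hex, minus the 077000000.
    apid, ground_id, time_str, n_first, n_end = filename[:-len('.lsf')].split('-')
    run_id = '%09d' % (77000000 + int(ground_id, 16))
    if n_first != '0000':
        raise ValueError('expected nFirst == 0000, got %s' % n_first)
    return apid, run_id, time_str, n_first, n_end

# Hypothetical example: GroundId 0x28F1 == 10481 -> run 077010481
print(parse_lsf_name('0123-28F1-20060214120000-0000-0042.lsf')[1])  # -> 077010481
```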
  5. Login to lat-dmz01 and enter the following to connect to the MySQL database:
    mysql --user elogbook elogbook --password
    (enter the password at the prompt)
  6. Enter the following command to reset the analysis state:
    update LICOS_activities set JobCompleteList = '[None]', AnalysisState_FK = 7 where RunID_FK = '077RRRRRR';
    (replace 077RRRRRR with the acquire run id)
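The statement can be generated for a given acquire run number so the 077 prefix and quoting are not mistyped (an illustrative helper, not an official script):

```python
def reset_statement(run_number):
    # Build the analysis-state reset shown above for acquire run
    # <run_number> (e.g. 10481 -> RunID_FK '077010481').
    run_id = '077%06d' % int(run_number)
    return ("update LICOS_activities set JobCompleteList = '[None]', "
            "AnalysisState_FK = 7 where RunID_FK = '%s';" % run_id)

print(reset_statement(10481))
```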
  7. Copy the lsf file(s) to "/nfs/data/lsf/.".  If the corresponding DAQ run had multiple steps, there will be multiple lsf files.
  8. AnalysisEngine will see them and should be able to reprocess the run.


  1. Selim notes:

    To copy the database record from lat-dmz01 to lat-hobbit5, here is what is needed:

    On lat-dmz01, connect to mysql with the user elogbookadmin:

    mysql --user elogbookadmin elogbook --password

    mysql> select * from LICOS_activities where RunID_FK = 77010481 into outfile '/tmp/077010481.txt';

    scp the file /tmp/077010481.txt to glastlnx06.

    Login to lat-hobbit5 and connect to mysql the same way as above and enter the following SQL command:

    load data local infile '077010481.txt' into table LICOS_activities;

    If needed, update the SQL record as described in this Confluence page.

  2. Latest method of resurrecting run data, via Bryson: for run 15459, for example, go to the I&T "rawData" directory /nfs/farm/g/glast/u40/Integration/rawData/077015459, find the RetDef-padded.xml file,
    and execute "getEvents.exe RetDef-padded.xml 077015459.evt" in a NIGHTLY_DEVEL environment.
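That recipe can be sketched as a small helper that assembles the path to RetDef-padded.xml and the getEvents.exe command line (illustrative only; the helper name is mine, and the result should be run from a NIGHTLY_DEVEL environment):

```python
import os

def get_events_command(run_number, root='/nfs/farm/g/glast/u40/Integration/rawData'):
    # Locate RetDef-padded.xml for the run and build the getEvents.exe
    # invocation, e.g. run 15459 -> .../077015459/RetDef-padded.xml.
    run_id = '077%06d' % int(run_number)
    retdef = os.path.join(root, run_id, 'RetDef-padded.xml')
    return ['getEvents.exe', retdef, run_id + '.evt']

print(' '.join(get_events_command(15459)))
```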

  3. If the MCR is unavailable, the LICOS_activities table can be accessed at SLAC by

    mysql --user elogbookadmin elogbook --password -h glastdb

     (give or take some format issues).

  4. For I&T/MOC storage issues:

    PktDump.py --run 77015783 -p 956 -p 957 -p 958 -o 15783.pkt -a /nfs/farm/g/glast/u42/ISOC/Archive/level0/