
Overview

  • Conditions for Task Launching
    1. GCN Notices
    2. Level 1 (FT1/2) Data availability
    3. GRB task launcher process and DataCatalog querying
  • Individual Task Data Processing
    • GRB_blind_search
    • GRB_refinement
    • GRB_afterglow
    • DRP_monitoring
    • flareDetection
    • PGwave

Conditions for Task Launching

  1. GCN Notices arrive asynchronously via the GCN socket connection.
    • On arrival, each Notice is entered into the GCNNOTICES db table.
    • If the Notice refers to a new GRB, a GRB_ID is assigned, and a new entry in the GRB db table is created.
    • For each new entry in the GRB db table, two flags are initialized to false (0): L1_DATA_AVAILABLE and ANALYSIS_VERSION. L1_DATA_AVAILABLE is set to true once L1 data are available for the GRB_refinement task and that task has been submitted. ANALYSIS_VERSION is set to true once the GRB_refinement task has run and the L1 data needed by the GRB_afterglow task are available. (The field names and the conditions for altering them will need to be reviewed.) A sketch of this flag bookkeeping appears after this list.
  2. When new L1 (FT1/2) data are available, ASP tasks are launched.
    • GRB_blind_search, flareDetection, and PGwave are designed to process each downlink and are launched unconditionally.
    • GRB_refinement, GRB_afterglow, and DRP_monitoring require data covering specific time intervals and can run only once those data are available. Presently (10/15/07), the first two are handled by the GRB task launcher process.
  3. GRB task launcher process and DataCatalog querying
    • In the BlindSearch process, a script (grb_followup.py) is run that queries the GRB db table for entries satisfying two sets of conditions:
      • L1_DATA_AVAILABLE=0: These are GRB candidates that require the GRB_refinement task to be run.
      • L1_DATA_AVAILABLE=1 and ANALYSIS_VERSION=0: These are GRB candidates that have had GRB_refinement task run, but still require GRB_afterglow to be run.
    • Based on the results of those queries, grb_followup.py computes the time intervals to be considered and queries the DataCatalog for the required FT1/2 files; this selection logic is sketched in the second code example after this list.
    • If those files are available, the corresponding tasks are launched.
    • The GRB db table entries are updated by the launched tasks.
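
The flag handling described in item 1 can be summarized with a minimal Python sketch. The table and column names (GRB, GRB_ID, INITIAL_RA, INITIAL_DEC, INITIAL_ERR_RADIUS, L1_DATA_AVAILABLE, ANALYSIS_VERSION) are taken from this page; the function names, the qmark parameter style, and the bare DB-API connection object (conn) are illustrative placeholders rather than the actual ASP database layer.

    def new_grb_entry(conn, grb_id, ra, dec, err_radius):
        """Create a GRB row for a new burst with both flags cleared (0)."""
        cur = conn.cursor()
        cur.execute(
            "INSERT INTO GRB (GRB_ID, INITIAL_RA, INITIAL_DEC, INITIAL_ERR_RADIUS, "
            "L1_DATA_AVAILABLE, ANALYSIS_VERSION) VALUES (?, ?, ?, ?, 0, 0)",
            (grb_id, ra, dec, err_radius))
        conn.commit()

    def mark_refinement_submitted(conn, grb_id):
        """Set L1_DATA_AVAILABLE once L1 data exist and GRB_refinement is submitted."""
        cur = conn.cursor()
        cur.execute("UPDATE GRB SET L1_DATA_AVAILABLE = 1 WHERE GRB_ID = ?", (grb_id,))
        conn.commit()

    def mark_afterglow_ready(conn, grb_id):
        """Set ANALYSIS_VERSION once GRB_refinement has run and the afterglow L1 data exist."""
        cur = conn.cursor()
        cur.execute("UPDATE GRB SET ANALYSIS_VERSION = 1 WHERE GRB_ID = ?", (grb_id,))
        conn.commit()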
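
The grb_followup.py selection logic in item 3 is sketched below. The two WHERE clauses mirror the conditions listed above; datacat.query(), the launch_task callback, and time_interval() are hypothetical stand-ins for the actual DataCatalog client and pipeline-launch interfaces.

    REFINEMENT_PENDING = "SELECT GRB_ID FROM GRB WHERE L1_DATA_AVAILABLE = 0"
    AFTERGLOW_PENDING = ("SELECT GRB_ID FROM GRB "
                         "WHERE L1_DATA_AVAILABLE = 1 AND ANALYSIS_VERSION = 0")

    def launch_pending_followups(conn, datacat, launch_task):
        """Launch GRB_refinement/GRB_afterglow for bursts whose FT1/2 data exist."""
        cur = conn.cursor()
        for query, task in ((REFINEMENT_PENDING, "GRB_refinement"),
                            (AFTERGLOW_PENDING, "GRB_afterglow")):
            cur.execute(query)
            for (grb_id,) in cur.fetchall():
                # Compute the time window this task needs for this burst.
                tstart, tstop = time_interval(grb_id, task)
                ft_files = datacat.query(ftypes=("FT1", "FT2"),
                                         tstart=tstart, tstop=tstop)
                if ft_files:   # launch only when the required L1 data are present
                    launch_task(task, GRB_ID=grb_id, files=ft_files)

    def time_interval(grb_id, task):
        """Placeholder: the prompt-emission window for GRB_refinement or the
        longer post-burst window for GRB_afterglow."""
        ...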

Individual Task Data Processing

  • GRB_blind_search
    1. The DataCatalog is queried for the FT1 file based on DownlinkId.
    2. The GRB_ASP_CONFIG db table is queried for the blind search configuration: log-likelihood thresholds, event number partition size, effective deadtime between burst candidates, etc. (Steps 1 and 2 are sketched in the first code example after this list.)
    3. The FT1 file is read in and analyzed.
    4. For each GRB candidate:
      • A working directory is created on NFS.
      • A LatGcnNotice is generated; a text version is written to the cwd, and a GCN packet version is added to the GCNNOTICES db table.
      • An email notification is sent out.
      • If the candidate corresponds to a burst already in the GCNNOTICES db table (matched via an algorithm TBD), the new GCNNOTICES entry is marked as an "UPDATE" (ISUPDATE=1); otherwise it is marked as "NEW" (ISUPDATE=0) and a GRB db table entry is created with the candidate burst parameters (INITIAL_RA, INITIAL_DEC, INITIAL_ERR_RADIUS, MET(=GRB_ID)). This bookkeeping is sketched in the second code example after this list.
    5. grb_followup.py is executed to launch follow-up tasks for all pending GRB analyses.
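
A minimal sketch of steps 1 and 2, assuming the same hypothetical datacat.query() wrapper as above and a DB-API connection to the ASP tables; the GRB_ASP_CONFIG column names (PARAMETER, VALUE) are assumptions for illustration.

    def blind_search_inputs(conn, datacat, downlink_id):
        """Gather the inputs for one downlink's blind search (steps 1 and 2)."""
        # Step 1: locate the FT1 file for this downlink in the DataCatalog.
        ft1_files = datacat.query(ftype="FT1", DownlinkId=downlink_id)

        # Step 2: read the blind-search tuning parameters from GRB_ASP_CONFIG,
        # e.g. log-likelihood thresholds, partition size, effective deadtime.
        cur = conn.cursor()
        cur.execute("SELECT PARAMETER, VALUE FROM GRB_ASP_CONFIG")
        config = dict(cur.fetchall())
        return ft1_files, config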
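
The step 4 bookkeeping for a single burst candidate can be sketched as follows. ISUPDATE and the GRB column names come from this page; the GCNNOTICES columns (MET, RA, DEC), the cand dictionary, and is_known_burst() are illustrative assumptions, with is_known_burst() standing in for the TBD matching algorithm.

    def register_candidate(conn, cand):
        """Record a blind-search candidate in GCNNOTICES and, if new, in GRB (step 4)."""
        known = is_known_burst(conn, cand)
        cur = conn.cursor()
        # Every candidate gets a GCNNOTICES entry, marked UPDATE (1) or NEW (0).
        cur.execute(
            "INSERT INTO GCNNOTICES (MET, RA, DEC, ISUPDATE) VALUES (?, ?, ?, ?)",
            (cand["met"], cand["ra"], cand["dec"], 1 if known else 0))
        if not known:
            # New burst: seed a GRB row with the candidate parameters; GRB_ID = MET.
            cur.execute(
                "INSERT INTO GRB (GRB_ID, INITIAL_RA, INITIAL_DEC, INITIAL_ERR_RADIUS, "
                "L1_DATA_AVAILABLE, ANALYSIS_VERSION) VALUES (?, ?, ?, ?, 0, 0)",
                (cand["met"], cand["ra"], cand["dec"], cand["err_radius"]))
        conn.commit()

    def is_known_burst(conn, cand):
        """Placeholder for the (TBD) test against existing GCNNOTICES entries."""
        ...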