...

Proposed resolution: Include EVENT_CLASS as Jim describes, in place of CONVERSION_LAYER, and as a way to encode other event classes that we come up with. After discussion with Jim, I think we should also have a column called something like CONVERSION that would indicate, e.g., by the value 1, 2, or 3, whether the event is FRONT, BACK, or CAL-ONLY. This would be in addition to EVENT_CLASS, although until we define other ways to discriminate among kinds of response functions the two columns will carry the same value.
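The proposed encoding can be sketched as a simple enumeration. The names and values follow the discussion above (1 = FRONT, 2 = BACK, 3 = CAL-ONLY); nothing here is in the FFD yet, and the class name is hypothetical:

```python
from enum import IntEnum

# Hypothetical encoding of the proposed CONVERSION column; the values
# follow the discussion above and are not yet part of the FFD.
class Conversion(IntEnum):
    FRONT = 1     # event converted in the front (thin-converter) section
    BACK = 2      # event converted in the back (thick-converter) section
    CAL_ONLY = 3  # calorimeter-only event

print(int(Conversion.BACK))  # 2
```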

...

Here are a few more issues that have arisen in the run-up to Ground Readiness Test 5, in which the ISOC transfers a realistic FT1 file to the GSSC, which will verify that it is consistent in every testable way with the detailed specifications in the Science Data Products File Format Document (FFD; a Word file). The FFD has not been finalized, and several loose ends have been recognized regarding the specification of the FT1 headers and columns.

20. Precision of time in TSTART/TSTOP

From a note by Tom Stephens:

An interesting question came up as we were working on testing our next software release, concerning the precision of the time-related values in the FITS file headers. In the data tables of all the files, columns related to time are stored as double precision, which gives 15 significant digits (microsecond precision), which is the requirement. I've never seen any such specification for the keyword values in the headers, although I would assume it should be the same. The problem has to do with rounding. Here's what we found.

As part of our data ingest system, we validate the incoming FITS files.  One of those checks is to make sure that the time values in the tables all fall within the specified TSTART and TSTOP values given in the header.  In several files we were seeing things like the following:

In the header we have:
TSTART  =   1.540365873394E+08   => 154036587.3394

In the data we have
TIME    =   1.54036587339366E+08 => 154036587.339366

which is the same if we round the data value to the ten-thousandth of a second. But since both are read into double precision variables, a simple comparison causes the file to fail verification, because the data time is earlier than TSTART. Effectively, the files are correct to the level of precision in the header but not to the level of precision in the data.

The GSSC can only check the files to the level of precision in the header keywords, so my question is what level of precision should we have there?  It doesn't really matter to me what the answer is, but we need to know to develop the software and I think it should be consistent across all the data files created by both LAT and GBM.
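The failure mode Tom describes can be reproduced directly with the two values from the note (a sketch; the variable names are illustrative, not from any ISOC or GSSC tool):

```python
# Values copied from the note above.
tstart = float("1.540365873394E+08")   # header keyword: 4 decimal places
time = float("1.54036587339366E+08")   # event time in the data table

# Rounded to the header's precision, the two values agree...
print(round(time, 4) == round(tstart, 4))  # True

# ...but a naive comparison flags the event as earlier than TSTART,
# so the file fails verification:
print(time < tstart)  # True
```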

(Digel) The consensus seems to be simply to increase the number of decimal places in the ASCII representations of TSTART and TSTOP in the header to reach the microsecond level. This is probably good enough, although we are right at the limit of the precision of doubles, and with floating-point representations a comparison that comes down to the last digit of precision can be hit or miss.
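One way to avoid a hit-or-miss comparison at the last digit is to validate with a tolerance of half the least significant digit carried in the header, rather than exactly. This is only a sketch of that idea; the function and its default are hypothetical, not part of the FFD or of the GSSC checks:

```python
def in_range(t, tstart, tstop, decimals=6):
    """True if t lies in [tstart, tstop] to within half of the last
    decimal place carried by the header keywords (illustrative only)."""
    pad = 0.5 * 10.0 ** (-decimals)
    return (tstart - pad) <= t <= (tstop + pad)

# With the values from Tom's note, which carry 4 decimal places in the
# header, the event passes instead of being flagged as early:
print(in_range(154036587.339366, 154036587.3394, 154036600.0, decimals=4))  # True
```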

Actually, the particular example that Tom cited was from an FT1 file that was generated before we took care to set TSTART and TSTOP properly; makeFT1 was given no information about those values and so just used the actual times of the first and last events in the file. In DC2 we set the TSTART and TSTOP values to be integral numbers of seconds. Even in the flight data, when runs and downlinks start and end at whatever times they do, we will only rarely have an event within a microsecond of TSTART or TSTOP for a given downlink or run.

Proposed resolution: Modify makeFT1 to write TSTART and TSTOP to 6 decimal places.
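As a sketch, the fix amounts to carrying six decimal places of seconds in the keyword values; if the exponential form currently in the headers is kept, the same resolution requires 14 digits after the decimal point of the significand. The formatting below is illustrative, not the actual makeFT1 change:

```python
t = 154036587.339366  # a mission-elapsed time in seconds

# Fixed-point with 6 decimals (microsecond resolution):
print(f"TSTART  = {t:.6f}")   # TSTART  = 154036587.339366

# Equivalent resolution in the exponential form the headers use today:
print(f"TSTART  = {t:.14E}")  # TSTART  = 1.54036587339366E+08
```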