
...

This page was started 31 October 2006; further comments were added 22 Nov 2006 by SD; a Summary of Science Tool development directions was written on 23 Jan 2007.

Inquiring minds would like to know what directions the science tools will take over the year remaining before launch (or the ~2 years before tools are delivered to guest investigators). The science tools in question for this discussion are those in the standard set that will be made available to guest investigators, the 'SAE'. The idea is to define what is needed, what is desirable, and what fits within the time/person-power constraints.

...

Jim Chiang (11/29/06)--With regard to handling pointed observations, generalizing the PSF integrals to include a time-dependent zenith angle cut may turn out to be computationally intractable. As an alternative, Julie suggested making a cut on acceptance as a function of inclination angle with respect to the instrument z-axis. This would be straightforward to implement in the exposure calculations, except that the cuts on the events would be applied to the measured inclinations, whereas our IRFs are defined as a function of true inclination. An alternative that avoids this sort of approximation can actually be applied within the current set of ScienceTools: define a set of GTIs such that the instrument z-axis is within a specified angle (say 115 degrees) of the spacecraft zenith. For pointed mode, this will likely decrease the on-source exposure time substantially, but it is a procedure that can be applied now.
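A minimal sketch of how such a zenith-constrained GTI set might be built from the spacecraft (FT2) file is given below, assuming the usual START/STOP, RA_SCZ/DEC_SCZ, and RA_ZENITH/DEC_ZENITH columns; the file name and the 115-degree threshold are illustrative only, not part of any existing tool.

{code:python}
# Sketch: build GTIs where the instrument z-axis lies within MAX_SEP degrees
# of the spacecraft zenith, using the FT2 spacecraft file.
# Assumed FT2 columns: START, STOP, RA_SCZ, DEC_SCZ, RA_ZENITH, DEC_ZENITH.
import numpy as np
from astropy.io import fits

MAX_SEP = 115.0  # degrees; illustrative threshold

def angular_sep(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (all inputs in degrees)."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (np.sin(dec1) * np.sin(dec2)
               + np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2))
    return np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))

with fits.open("spacecraft_ft2.fits") as ft2:          # hypothetical file name
    sc = ft2["SC_DATA"].data
    sep = angular_sep(sc["RA_SCZ"], sc["DEC_SCZ"],
                      sc["RA_ZENITH"], sc["DEC_ZENITH"])
    good = sep <= MAX_SEP
    starts = np.array(sc["START"], dtype=float)
    stops = np.array(sc["STOP"], dtype=float)

# Merge consecutive good FT2 intervals into GTIs (MET seconds).
gtis = []
for t0, t1, ok in zip(starts, stops, good):
    if not ok:
        continue
    if gtis and abs(gtis[-1][1] - t0) < 1e-3:   # contiguous with previous interval
        gtis[-1][1] = t1
    else:
        gtis.append([t0, t1])

for t0, t1 in gtis:
    print(f"GTI: {t0:.1f} -- {t1:.1f}")
{code}

The resulting intervals would then be intersected with the GTIs already carried by the event file before computing exposures.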

...

(Julie McEnery Nov 27)-- I think that for brighter bursts we may need to account for variations in the livetime fraction on timescales shorter than 30 s, or at least convince ourselves that this does not need to be done. If we don't, then the peak-to-peak flux measurements may turn out to be incorrect and bias joint spectral fits.

Pulsar analysis

...

From Masa: "In the pulsar tools area, the major works are:
1) to implement blind search tool (A4 tool),
2) to create a new tool to plot a pulse profile (w/ periodicity test results),
3) and to introduce a method that contains the full functionality of the tool, so that a software developer (or a Python script) can call it as a part of their software."
He has updated the [current status page|http://glast.gsfc.nasa.gov/ssc/dev/psr_tools/status.html] for pulsar tools with these items, including some more details.

"Another major task is to develop a system to ingest and distribute pulsar ephemerides database. Other than that, we have several minor improvements in mind, such as, improving gtephcomp output, implementing more time systems, and technical refactoring for easy maintenance."

...

Jim Chiang (11/29/06)--For my DC2 analysis of the Solar flare, I wrote an exposure tool that performs such an exposure correction as a function of time at a specific location on the sky. For the Solar flare, I compared the light curve obtained from gtbin and the output of this exposure tool against the flux estimates from a full likelihood fit. At least for bright sources such as the DC2 Solar flare, the comparison was quite favorable; see slides 8 and 9 of my DC2 close-out talk. I will make this exposure tool into a proper ScienceTool.
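A rough sketch of the kind of calculation such an exposure tool performs: for each spacecraft interval, weight the livetime by the effective area at the source's instantaneous off-axis angle and accumulate into time bins. The effective-area curve below is a crude placeholder for the real IRFs, and the file, column, and bin choices are illustrative.

{code:python}
# Sketch: exposure vs. time at a fixed sky position, accumulated from the FT2
# file as livetime x effective area at the source's off-axis angle.
# The effective-area curve is a placeholder, not the real LAT response.
import numpy as np
from astropy.io import fits

RA_SRC, DEC_SRC = 266.4, -28.9                     # illustrative source position (deg)

def angular_sep(ra1, dec1, ra2, dec2):
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    c = (np.sin(dec1) * np.sin(dec2)
         + np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def aeff(theta_deg):
    """Placeholder effective area (cm^2) vs. off-axis angle."""
    return np.where(theta_deg < 70.0, 8000.0 * np.cos(np.radians(theta_deg)), 0.0)

with fits.open("spacecraft_ft2.fits") as ft2:      # hypothetical file name
    sc = ft2["SC_DATA"].data
    theta = angular_sep(sc["RA_SCZ"], sc["DEC_SCZ"], RA_SRC, DEC_SRC)
    dexp = sc["LIVETIME"] * aeff(theta)            # cm^2 s per FT2 interval
    tmid = 0.5 * (sc["START"] + sc["STOP"])
    t0, t1 = float(sc["START"][0]), float(sc["STOP"][-1])

# Accumulate into coarse light-curve bins (1000 s here, purely illustrative).
bins = np.arange(t0, t1 + 1000.0, 1000.0)
exposure, _ = np.histogram(tmid, bins=bins, weights=dexp)
print(exposure[:10])
{code}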

...

DLB (11/26/06)--Since strong sources may be analyzed with XSPEC, we might want to create XSPEC functions suited to the ~GeV range. For example, it would be useful to have versions of the standard power law, broken power law, etc., normalized at 1 GeV. When data around a GeV are fit with functions normalized at 1 keV, the normalization is highly correlated with the other parameters.
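A small illustration of the point, independent of any particular XSPEC implementation: the same power law written with a 1 keV pivot and with a 1 GeV pivot. With the pivot inside the fitted band, the normalization barely moves when the index changes, whereas the 1 keV normalization swings by large factors.

{code:python}
# Sketch: power law normalized at 1 keV vs. at 1 GeV.  Far from the pivot,
# a small change in the index produces a large change in the normalization,
# which is what drives the strong parameter correlation in fits.
import numpy as np

def powerlaw_kev_pivot(E_keV, K1, gamma):
    """dN/dE = K1 * (E / 1 keV)**(-gamma), with K1 defined at 1 keV."""
    return K1 * (E_keV / 1.0) ** (-gamma)

def powerlaw_gev_pivot(E_keV, Kg, gamma, pivot_keV=1.0e6):
    """dN/dE = Kg * (E / 1 GeV)**(-gamma), with Kg defined at 1 GeV."""
    return Kg * (E_keV / pivot_keV) ** (-gamma)

Kg = 1.0e-9                                  # normalization at 1 GeV (arbitrary)
for gamma in (2.0, 2.1):
    K1 = Kg * (1.0e6) ** gamma               # equivalent 1 keV normalization
    print(f"gamma = {gamma}:  K1 = {K1:.3e}   Kg = {Kg:.3e}")

# Cross-check: both forms agree at 1 GeV for gamma = 2.0.
E = 1.0e6
assert np.isclose(powerlaw_kev_pivot(E, Kg * (1.0e6) ** 2.0, 2.0),
                  powerlaw_gev_pivot(E, Kg, 2.0))
# A change of 0.1 in gamma moves K1 by a factor of 10**0.6 ~ 4 while Kg is unchanged.
{code}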

SAE enhancements from the GSSC beta test

These are comments compiled by Dave Davis from members of the GLAST Users Committee who participated in the GUC Beta Test in November.

DS9 regions:
gtselect should be able to use ds9 regions for selection,
and the SAE tools should recognize ds9 region files.
Proposed method:
Use gtbin to make a projected map;
ds9 can then be used to select and exclude regions (see the sketch below).
Is there a library to convert .reg files --> DSS regions?
Do we need to translate the projected circle --> spherical circle?
Is this sufficient?
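As a hedged sketch of the first step in the proposed method, the snippet below parses a simple ds9 fk5 circle region and turns it into the ra/dec/rad acceptance-cone values that gtselect already accepts. Real .reg files carry many more shapes, coordinate systems, and exclusion regions; the parsing here is deliberately minimal and the file name is hypothetical.

{code:python}
# Sketch: translate a simple ds9 fk5 'circle(ra,dec,radius)' region into
# gtselect-style ra/dec/rad values.  Only decimal-degree centers and radii
# with optional "/'/d unit suffixes are handled; other shapes, sexagesimal
# coordinates, and exclusion regions are not.
import re

def _to_degrees(token):
    token = token.strip()
    if token.endswith('"'):
        return float(token[:-1]) / 3600.0    # arcseconds -> degrees
    if token.endswith("'"):
        return float(token[:-1]) / 60.0      # arcminutes -> degrees
    return float(token.rstrip("d"))          # already in degrees

def parse_circle_region(path):
    """Return (ra, dec, radius) in degrees from a minimal ds9 .reg file."""
    pattern = re.compile(r"circle\(([^,]+),([^,]+),([^)]+)\)")
    with open(path) as regfile:
        for line in regfile:
            match = pattern.search(line)
            if match:
                ra, dec, rad = (_to_degrees(tok) for tok in match.groups())
                return ra, dec, rad
    raise ValueError("no circle region found in " + path)

ra, dec, rad = parse_circle_region("source.reg")        # hypothetical file
print(f"gtselect ra={ra} dec={dec} rad={rad} ...")      # remaining parameters omitted
{code}

Whether a circle drawn on a projected counts map can be treated directly as a spherical acceptance cone (the question raised above) would still need to be checked for large regions.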

Default values:
gtselect should add keywords that downstream programs can use for
default values. Maybe these should be the DSS keywords, if we can
determine which set to use when there are multiple DSS
keywords. Alternatively, we might be able to use RA_FOV and
DEC_FOV to set the field center for later programs.
1) How do other FTOOLS handle this?

  • keywords?

2) How to implement?

  • INDEF
  • DEFAULT

The tools need to start with reasonable defaults.
E.g., it has been suggested that for gtlikelihood,
gui=yes and saving the output file should be on by default.
1) What are reasonable values?

  • Most FTOOLS start with the values from the last time the tool
    was run; they do not inherit from the previous tool, but they
    do read header info.

Another way to make the inputs more reasonable is to make them
more compact so that the user can reuse parts of the command
file. One method would be to use the fselect expression
interface. This would allow queries like
"binary && mag <= 5.0" or, to be more GLAST-specific,
"ENERGY > 100. && ZENITH_ANGLE < 20."
This also allows one to use the region file format,
regfilter("region.reg", XPOS, YPOS)
and it allows flexible GTI selections,
gtifilter( "", TIME, "START", "STOP" )
(see the sketch below).

Parameter names should be consistent between the tools. This should
include the GUI.
1) Who should do this, and how should it be split up?
2) Present this at a science tools meeting.

Names should be shorter where possible:
gtlivetimecube -> gtltc, gtexp, ...?
1) Suggestions for names? What should we consider too long (> 8 characters)?
2) Links for the programs

  • How to handle parameter names

Lightcurve analysis needs to be revisited and mapped out:
how to use exposure-corrected data (see the sketch below),
how to use regions to exclude nearby sources.
Develop threads:
1) What has already been done?
quicklook analysis
publication-quality analysis
2) Can we adapt existing scripts?

  • Evaluate the amount of work.
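A minimal sketch of the exposure-corrected step referred to above: counts binned in time, divided by the exposure accumulated in the same bins (for example, from the exposure-vs-time tool described earlier on this page). The input arrays are placeholders.

{code:python}
# Sketch: exposure-corrected light curve, i.e. counts per time bin divided by
# the exposure (cm^2 s) accumulated in the same bins.  Inputs are placeholders;
# the exposure array would come from an exposure-vs-time tool.
import numpy as np

def exposure_corrected_lightcurve(event_times, bin_edges, exposure):
    """Return (flux, flux_err) per bin in counts / (cm^2 s)."""
    counts, _ = np.histogram(event_times, bins=bin_edges)
    flux = np.zeros(exposure.size)
    err = np.zeros(exposure.size)
    good = exposure > 0
    flux[good] = counts[good] / exposure[good]
    err[good] = np.sqrt(counts[good]) / exposure[good]   # simple Poisson errors
    return flux, err

# Toy usage: uniform counts over an unevenly exposed interval.
edges = np.linspace(0.0, 1.0e4, 11)
times = np.random.default_rng(1).uniform(0.0, 1.0e4, 500)
expo = np.linspace(1.0e7, 2.0e7, 10)                     # cm^2 s per bin (fake)
flux, err = exposure_corrected_lightcurve(times, edges, expo)
print(flux)
print(err)
{code}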

Map analysis:
PSF generation for a given source and spectral fit.
1) compare with the source distribution
2) PSF subtraction
3) image restoration/deconvolution
Issues:
How to use the fitted parameters from gtlikelihood or XSPEC?
a) read the XML fit parameters or the XSPEC fit parameters
b) convert to some FITS format?
Need both a 1-d PSF to compare with the radial distribution and a 2-d PSF
for PSF subtraction/deconvolution (see the sketch below).
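One way to produce the 1-d PSF mentioned above is to weight a parameterized PSF(theta; E) by the fitted spectrum and the effective area over the energy band. Everything in the sketch below (the Gaussian PSF shape, its energy scaling, the effective area, the spectral parameters) is a placeholder standing in for the real IRFs and the gtlikelihood/XSPEC fit output.

{code:python}
# Sketch: spectrally weighted 1-d PSF,
#   psf_avg(theta) = Int dE dN/dE * Aeff(E) * PSF(theta; E)
#                    / Int dE dN/dE * Aeff(E)
# The PSF shape, effective area, and spectral parameters are placeholders.
import numpy as np

def psf(theta_deg, E_MeV):
    """Placeholder PSF, narrower at high energy (not the real LAT PSF)."""
    sigma = 0.8 * (E_MeV / 100.0) ** -0.8 + 0.05          # degrees (made up)
    return np.exp(-0.5 * (theta_deg / sigma) ** 2) / (2.0 * np.pi * sigma ** 2)

def aeff(E_MeV):
    """Placeholder effective area (cm^2)."""
    return 8000.0 * (1.0 - np.exp(-E_MeV / 300.0))

def dnde(E_MeV, K=1.0e-9, gamma=2.1):
    """Fitted spectrum; parameters would come from gtlikelihood or XSPEC."""
    return K * (E_MeV / 100.0) ** (-gamma)

theta = np.linspace(0.0, 5.0, 101)                        # degrees
E = np.logspace(2, 5, 60)                                 # 100 MeV -- 100 GeV
weights = dnde(E) * aeff(E)                               # expected-counts weighting

numer = np.trapz(weights[:, None] * psf(theta[None, :], E[:, None]), E, axis=0)
psf_avg = numer / np.trapz(weights, E)
print(psf_avg[:5])
{code}

The same weighted PSF, evaluated on a 2-d grid of offsets, would serve for PSF subtraction or deconvolution.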

Pulsar analysis:
Energy-dependent and possibly PSF-dependent cuts.
Having the tools overwrite the TIME column is not
optimal. (Change the name? TIME_ORIG)

All threads need to be brought up to date.

Reference documentation needs significant updates.

GUI interface for the tools.
This is rather overarching.
1) What tools need to be in the GUI?
2) What type of GUI?
ROOT GUIs
xselect type (or even use xselect?)

The current tools' GUIs need to be refined. E.g., the save and "should I overwrite"
prompts need to be clearer.

XSPEC-like analysis should be explored.
The ability to get a counts or "pseudo-counts" spectrum with an
appropriate response (RSP) matrix would facilitate multi-mission analysis.
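A rough sketch of what a "pseudo-counts plus response" product might look like: a binned counts spectrum together with a response matrix built as effective area times a simple energy-redistribution kernel, which is the PHA + RSP pair an XSPEC-style fit consumes. The binning, effective area, and energy resolution below are placeholders, and writing proper OGIP FITS files is not shown.

{code:python}
# Sketch: counts spectrum plus a simple response matrix (Aeff x Gaussian
# energy redistribution), the PHA + RSP pair an XSPEC-like fit would use.
# Effective area, resolution, and binning are placeholders; OGIP output omitted.
from math import erf, sqrt
import numpy as np

e_true = np.logspace(2, 5, 61)        # true-energy bin edges, 100 MeV -- 100 GeV (MeV)
e_meas = np.logspace(2, 5, 31)        # measured-energy (channel) edges (MeV)

def aeff(E_MeV):
    return 8000.0 * (1.0 - np.exp(-E_MeV / 300.0))        # cm^2, made up

def edisp(lo, hi, E_true, frac_res=0.1):
    """Probability that an event of energy E_true is measured in [lo, hi]."""
    sigma = frac_res * E_true
    cdf = lambda x: 0.5 * (1.0 + erf((x - E_true) / (sqrt(2.0) * sigma)))
    return cdf(hi) - cdf(lo)

centers = np.sqrt(e_true[:-1] * e_true[1:])               # true-energy bin centers
rsp = np.zeros((centers.size, e_meas.size - 1))
for i, Et in enumerate(centers):
    for j in range(e_meas.size - 1):
        rsp[i, j] = aeff(Et) * edisp(e_meas[j], e_meas[j + 1], Et)

# Pseudo-counts spectrum from placeholder event energies.
energies = np.random.default_rng(2).uniform(100.0, 1.0e4, 2000)
counts, _ = np.histogram(energies, bins=e_meas)
print("response matrix shape:", rsp.shape)
print("counts per channel:", counts)
{code}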