
...

This page was started 31 October 2006; further comments added 22 Nov 2006 by SD; a Summary of Science Tool development directions was written on 23 Jan 2007

Inquiring minds would like to know what directions the science tools will take over the year remaining before launch (or the ~2 years before tools are delivered to guest investigators). The science tools in question for this discussion are those that will be part of the standard set that will be made available to guest investigators, the 'SAE'. The idea is to define what is needed and what is desirable, and what fits within the time/person-power constraints.

...

(Julie McEnery Nov 27)-- I think that for brighter bursts we may need to account for variations in the livetime fraction on timescales shorter than 30 s, or at least convince ourselves that this does not need to be done. If we don't do this, then the peak-to-peak flux measurements may turn out to be incorrect and bias joint spectral fits.
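A minimal sketch of the kind of correction this implies (function and variable names are hypothetical, not an SAE interface): each time bin's count rate is divided by that bin's own livetime fraction, instead of assuming one average value across the whole ~30 s interval.

```python
# Hypothetical sketch: correct per-bin count rates for a time-varying
# livetime fraction instead of one average value per ~30 s interval.
def corrected_rates(counts, bin_widths, livetime_fracs):
    """counts[i] photons arrive in a bin of width bin_widths[i] seconds,
    during which the instrument was live a fraction livetime_fracs[i]."""
    rates = []
    for n, dt, f in zip(counts, bin_widths, livetime_fracs):
        live_seconds = dt * f          # effective exposure in this bin
        rates.append(n / live_seconds)
    return rates

# A bright-burst bin with 50% deadtime has twice the rate a naive
# counts/width estimate would give.
print(corrected_rates([100, 100], [1.0, 1.0], [1.0, 0.5]))  # [100.0, 200.0]
```

If the livetime fraction really does vary within bins for bright bursts, ignoring it biases exactly the peak bins that dominate a joint spectral fit.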

Pulsar analysis

...

From Masa: "In the pulsar tools area, the major works are:
1) to implement blind search tool (A4 tool),
2) to create a new tool to plot a pulse profile (w/ periodicity test results),
3) and to introduce a method that contains the full functionality of the tool, so that a software developer (or a Python script) can call it as part of their software."
[He has updated the [current status page|http://glast.gsfc.nasa.gov/ssc/dev/psr_tools/status.html] for pulsar tools with these items, including some more details.]

"Another major task is to develop a system to ingest and distribute a pulsar ephemerides database. Other than that, we have several minor improvements in mind, such as improving gtephcomp output, implementing more time systems, and technical refactoring for easier maintenance."
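As a rough illustration of what the pulse-profile tool (item 2 above) involves, here is a hedged sketch of epoch folding plus a Pearson chi-square periodicity test. The function names, binning, and toy data are invented for illustration; they are not the actual pulsar-tool interfaces.

```python
import math

def fold_profile(event_times, period, n_bins=16):
    """Fold photon arrival times at a trial period into a pulse profile."""
    profile = [0] * n_bins
    for t in event_times:
        phase = (t / period) % 1.0
        profile[int(phase * n_bins) % n_bins] += 1
    return profile

def chi2_test(profile):
    """Pearson chi-square against a flat profile; large values
    indicate significant pulsed emission."""
    mean = sum(profile) / len(profile)
    return sum((n - mean) ** 2 / mean for n in profile)

# Toy data: 200 events clustered near phase 0 of a 1 s period.
events = [i + 0.02 * (i % 3) for i in range(200)]
prof = fold_profile(events, 1.0, n_bins=8)
print(chi2_test(prof) > chi2_test([25] * 8))  # pulsed profile beats flat
```

A blind-search tool (item 1) would run a test like this over a grid of trial periods (in practice with FFT-based methods rather than brute-force folding).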

...

DLB (11/26/06)--Since strong sources may be analyzed with XSPEC, we might want to create XSPEC functions suited to the ~GeV range. For example, it would be useful to have a version of the standard power law, broken power law, etc., functions normalized at 1 GeV. When data at a GeV are fit by functions normalized at 1 keV, the normalization is highly correlated with the other parameters.
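To illustrate the pivot-energy point (a sketch, not proposed XSPEC code; the function names are invented), the two normalizations describe the same power law, but when the pivot sits six decades below the data, any change in the index must be compensated by a huge change in the normalization, producing the strong parameter correlation noted above:

```python
def powerlaw_keV_pivot(E_GeV, K, gamma):
    """XSPEC-style power law normalized at 1 keV:
    N(E) = K * (E / 1 keV)^-gamma.  With E in GeV, E/1keV = 1e6 * E_GeV."""
    return K * (1e6 * E_GeV) ** (-gamma)

def powerlaw_GeV_pivot(E_GeV, K1, gamma):
    """Same shape normalized at 1 GeV: N(E) = K1 * (E / 1 GeV)^-gamma.
    Near 1 GeV, changing gamma barely moves the model, so K1 and
    gamma decouple in the fit."""
    return K1 * E_GeV ** (-gamma)

# The two forms are equivalent with K1 = K * 1e6^-gamma: a unit change
# in gamma rescales K by a factor of 10^6, but leaves K1 untouched.
K, gamma = 1.0, 2.0
K1 = K * 1e6 ** (-gamma)
print(powerlaw_keV_pivot(1.0, K, gamma) == powerlaw_GeV_pivot(1.0, K1, gamma))
```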

SAE Enhancements From the GSSC beta test

These are comments compiled by Dave Davis from members of the GLAST Users Committee who participated in the GUC Beta Test in November.

DS9 regions:
gtselect should be able to use ds9 regions for selection
and the SAE tools should recognize ds9 region files.
Proposed method:
Use gtbin to make a projected map
ds9 can then be used to select and exclude regions.
Is there a library to convert .reg files --> DSS regions?
do we need to translate the projected circle --> spherical circle?
Is this sufficient?
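As a sketch of the projected-vs-spherical question above, the circle can be applied directly on the sphere by cutting on angular separation, which avoids translating a projected circle at all. The region parser and event format here are invented for illustration and handle only a single fk5-style circle in degrees; a real converter would need the full ds9 region grammar.

```python
import math

def angular_sep(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine formula)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    a = (math.sin((d2 - d1) / 2) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(a)))

def parse_circle(line):
    """Parse a ds9-style line like 'circle(266.40,-28.94,10.0)',
    all values in degrees (toy parser, one shape only)."""
    inside = line[line.index("(") + 1:line.index(")")]
    ra, dec, radius = (float(x) for x in inside.split(","))
    return ra, dec, radius

def in_region(events, region_line):
    """Keep (ra, dec) events inside the spherical circle."""
    ra0, dec0, rad = parse_circle(region_line)
    return [(ra, dec) for ra, dec in events
            if angular_sep(ra, dec, ra0, dec0) <= rad]

print(in_region([(10.0, 0.0), (30.0, 0.0)], "circle(10.0,0.0,5.0)"))
# [(10.0, 0.0)]
```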

Default values:
gtselect should add keywords that downstream programs can use for
default values. Perhaps these should be the DSS keywords, if we can
determine which set to use when there are multiple DSS
keywords. Alternately we might be able to use the RA_FOV and
DEC_FOV keywords to set the field center for later programs.
1) How do other FTOOLS handle this?

  • keywords?

2) How to implement

  • INDEF
  • DEFAULT

The Tools need to start with reasonable defaults.
e.g., it has been suggested that for gtlikelihood,
gui=yes and saving the output file should be on by default.
1) What are reasonable values?

  • Most FTOOLS start with the values from the last time the tool
    was run; they do not inherit from the previous tool, but they
    do read header info.

Another way to make the inputs more reasonable is to make them
more compact so that the user can reuse parts of the command
file. One method would be to use the fselect expression
interface. This would allow queries like
"binary && mag <= 5.0" or to be more glast specific
"energy > 100. && ZENITH_ANGLE < 20."
This also allows one to use the region file format
regfilter("region.reg", XPOS, YPOS)
and it allows flexible gti selections.
gtifilter( "", TIME, "START", "STOP" )
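The expression idea above can be mimicked in a few lines of Python. This is only a sketch of the interface (the real FTOOLS row filter lives in CFITSIO, and the regfilter/gtifilter functions are not reproduced here); the translation of C-style operators and the event-record format are assumptions for illustration.

```python
# Hypothetical sketch of an fselect-style row filter applied to event
# records, mimicking expressions like "energy > 100. && ZENITH_ANGLE < 20.".
def row_filter(events, expression):
    """Evaluate a C-style boolean expression per event row.
    '&&'/'||' are translated to Python's 'and'/'or'."""
    expr = expression.replace("&&", " and ").replace("||", " or ")
    return [ev for ev in events
            if eval(expr, {"__builtins__": {}}, dict(ev))]

events = [
    {"energy": 150.0, "ZENITH_ANGLE": 10.0},
    {"energy": 150.0, "ZENITH_ANGLE": 50.0},
    {"energy": 50.0,  "ZENITH_ANGLE": 10.0},
]
print(row_filter(events, "energy > 100. && ZENITH_ANGLE < 20."))
# [{'energy': 150.0, 'ZENITH_ANGLE': 10.0}]
```

The appeal of reusing the CFITSIO expression syntax is exactly that queries like these, plus regfilter and gtifilter, come for free instead of needing per-tool parameters.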

...

Names should be shorter where possible
gtlivetimecube -> gtltc , gtexp ...?
1) suggestions for names?
What should we consider too long (> 8 characters)?
2) Links for the programs

  • How to handle parameter names

lightcurve analysis needs to be revisited and mapped out.
how to use exposure-corrected data
how to use regions to exclude nearby sources.
Develop threads
1) What has already been done
quicklook analysis
publication quality analysis
2) Can we adapt existing scripts?

  • evaluate the amount of work.
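For the exposure-corrected part of the lightcurve thread, a minimal sketch (names are hypothetical; real per-bin exposures would come from the SAE exposure tools, and nearby-source exclusion via regions is omitted here):

```python
def light_curve(event_times, bin_edges, exposures):
    """Counts per time bin divided by that bin's exposure,
    yielding an exposure-corrected light curve."""
    fluxes = []
    for lo, hi, expo in zip(bin_edges[:-1], bin_edges[1:], exposures):
        n = sum(1 for t in event_times if lo <= t < hi)
        fluxes.append(n / expo if expo > 0 else float("nan"))
    return fluxes

# Two bins with one count each; the second bin has half the exposure,
# so its corrected flux is twice as high.
print(light_curve([0.5, 1.5], [0.0, 1.0, 2.0], [100.0, 50.0]))
# [0.01, 0.02]
```

A quicklook thread might stop here; a publication-quality thread would also need per-bin errors and the region-based exclusion of nearby sources mentioned above.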

map analysis:
psf generation for a given source and spectral fit.
1) compare with source distribution
2) PSF subtraction
3) image restoration/deconvolution
Issues:
How to use the fitted parameters from gtlikelihood or XSPEC
a) read the xml fit parameters or the xspec fit parameters
b) convert to some fits format?
Need both 1-d psf to compare radial distribution and a 2-d psf
for psf subtraction/deconvolution
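A hedged sketch of the 1-d radial profile needed for the PSF comparison (flat-sky offsets in degrees and all names are invented for illustration; a real tool would work in spherical coordinates and compare against the energy-dependent instrument PSF):

```python
import math

def radial_profile(events, center, r_max, n_bins):
    """Bin event offsets from a source position into annuli and return
    counts per unit annulus area (a 1-d radial surface-density profile)."""
    cx, cy = center
    dr = r_max / n_bins
    counts = [0] * n_bins
    for x, y in events:
        r = math.hypot(x - cx, y - cy)
        if r < r_max:
            counts[int(r / dr)] += 1
    # annulus between i*dr and (i+1)*dr has area pi*dr^2*((i+1)^2 - i^2)
    return [counts[i] / (math.pi * ((i + 1) ** 2 - i ** 2) * dr ** 2)
            for i in range(n_bins)]

# Events concentrated at the center produce a steeply falling profile.
prof = radial_profile([(0.0, 0.0), (0.0, 0.75)], (0.0, 0.0), 1.0, 2)
print(prof[0] > prof[1])  # True
```

Comparing such a profile with the model PSF profile addresses item 1; the 2-d case for PSF subtraction and deconvolution needs the same idea on a pixel grid.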

...