
DLB (11/26/06) -- Since strong sources may be analyzed with XSPEC, we might want to create XSPEC functions suited to the ~GeV range. For example, it would be useful to have versions of the standard power law, broken power law, etc., normalized at 1 GeV. When data at a GeV are fit by functions normalized at 1 keV, the normalization is highly correlated with the other parameters.

SAE Enhancements From the GSSC beta test
----------------------------------------

DS9 regions:
gtselect should be able to use ds9 regions for selection
and the SAE tools should recognize ds9 region files.
Proposed method:
Use gtbin to make a projected map
ds9 can then be used to select and exclude regions.
Is there a library to convert .reg files --> DSS regions?
Do we need to translate the projected circle --> spherical
circle?
Is this sufficient?
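
The projected-vs-spherical-circle question above could be sidestepped by testing events against a great-circle (acceptance-cone) radius directly. The sketch below is purely illustrative: `angular_sep_deg` and `in_spherical_circle` are hypothetical helper names, and events are assumed to carry RA/DEC columns in degrees.

```python
import math

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine form)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    sd = math.sin((dec2 - dec1) / 2.0) ** 2
    sr = math.sin((ra2 - ra1) / 2.0) ** 2
    a = sd + math.cos(dec1) * math.cos(dec2) * sr
    return math.degrees(2.0 * math.asin(math.sqrt(a)))

def in_spherical_circle(events, ra0, dec0, radius_deg):
    """Keep events whose true angular offset from (ra0, dec0) is
    within radius_deg -- a spherical circle, not a projected one."""
    return [ev for ev in events
            if angular_sep_deg(ev["RA"], ev["DEC"], ra0, dec0) <= radius_deg]
```

For small regions near the projection center a projected .reg circle and a spherical circle agree closely; the difference only matters for large radii or regions far from the tangent point.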

Default values:
gtselect should add keywords that downstream programs can use for
default values. These could be the DSS keywords, if we can
determine which set to use when there are multiple DSS
keywords. Alternatively, we might be able to use the RA_FOV and
DEC_FOV keywords to set the field center for later programs.
1) How do other FTOOLS handle this?

  • keywords?

2) How to implement?

  • INDEF
  • DEFAULT

The tools need to start with reasonable defaults.
E.g., it has been suggested that for gtlikelihood,
gui=yes and saving the output file should be on by default.
1) What are reasonable values?

  • Most FTOOLS start with the values from the last time the tool
    was run; they do not inherit from the previous tool, but they
    do read header info.
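
One possible precedence for filling in a parameter, combining the ideas above (explicit user value, then last-run par-file value, then a header keyword such as RA_FOV, then a hard-coded fallback), could be sketched as follows. This is purely illustrative: `resolve_default` and the source dictionaries are hypothetical, not part of any existing tool.

```python
def resolve_default(name, user_args, last_run_pars, header_keywords, hard_defaults):
    """Resolve a parameter value in precedence order:
    explicit user value > last-run par file > FITS header keyword > hard default.
    INDEF entries are treated as "no value" and skipped."""
    for source in (user_args, last_run_pars, header_keywords, hard_defaults):
        if name in source and source[name] not in (None, "INDEF"):
            return source[name]
    raise KeyError("no value available for parameter %r" % name)
```

The INDEF handling mirrors the FTOOLS convention of using INDEF to mean "take the value from the data file."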

Another way to make the inputs more "reasonable" is to make them
more compact so that the user can reuse parts of the command
file. One method would be to use the fselect expression
interface. This would allow queries like
"binary && mag <= 5.0" or, to be more GLAST specific,
"energy > 100. && ZENITH_ANGLE < 20."
This also allows one to use the region file format,
regfilter("region.reg", XPOS, YPOS)
and it allows flexible GTI selections:
gtifilter( "", TIME, "START", "STOP" )
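
To illustrate what an fselect-style expression buys the user, here is a minimal sketch of applying such an expression row by row, translating the C-style `&&`/`||` operators to Python before evaluating against a row's columns. The function name is hypothetical, and a real implementation would use the CFITSIO expression parser rather than `eval`.

```python
def filter_rows(rows, expression):
    """Apply an fselect-style boolean expression to each row (a dict of
    column values). C-style logical operators are translated to Python."""
    expr = expression.replace("&&", " and ").replace("||", " or ")
    # Evaluate with no builtins; only the row's columns are visible.
    return [row for row in rows
            if eval(expr, {"__builtins__": {}}, dict(row))]
```

For example, `filter_rows(rows, "energy > 100. && ZENITH_ANGLE < 20.")` keeps only rows passing both cuts, which is exactly the compact reuse the note above is after.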

Parameter names should be consistent between the tools. This should
include the GUI.
1) Who should do this, and how should it be split up?
2) Present this at a science tools meeting.

Names should be shorter where possible:
gtlivetimecube -> gtltc, gtexp, ...?
1) Suggestions for names?
What should we consider too long (> 8 characters)?
2) Links for the programs
  • How to handle parameter names

Lightcurve analysis needs to be revisited and mapped out:
how to use exposure-corrected data,
how to use regions to exclude nearby sources.
Develop threads.
1) What has already been done?
quicklook analysis
publication-quality analysis
2) Can we adapt existing scripts?

  • evaluate the amount of work.
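
The exposure-correction step mentioned above is, at its core, dividing per-bin counts by per-bin exposure. A minimal sketch, assuming counts and exposures (cm^2 s) per time bin and simple sqrt(N) Poisson errors; the function name is hypothetical:

```python
import math

def exposure_corrected_lightcurve(counts, exposures):
    """Convert per-bin counts and exposures into (rate, error) pairs
    with sqrt(N) Poisson errors; zero-exposure bins are flagged None."""
    points = []
    for n, exp in zip(counts, exposures):
        if exp <= 0.0:
            points.append((None, None))  # no livetime in this bin
        else:
            points.append((n / exp, math.sqrt(n) / exp))
    return points
```

A real thread would get the per-bin exposures from the livetime cube and effective area (gtltcube/gtexposure), and would need a proper low-count error treatment; this only shows the bookkeeping.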

Map analysis:
PSF generation for a given source and spectral fit.
1) compare with the source distribution
2) PSF subtraction
3) image restoration/deconvolution
Issues:
How to use the fitted parameters from gtlikelihood or XSPEC?
a) read the XML fit parameters or the XSPEC fit parameters
b) convert to some FITS format?
We need both a 1-d PSF to compare radial distributions and a 2-d PSF
for PSF subtraction/deconvolution.
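
The 1-d comparison above amounts to binning event offsets from the source into annuli and normalizing by annulus area, so the observed radial profile can be overplotted on the model PSF. A minimal sketch, with a hypothetical function name and a flat-sky area approximation (adequate for the small offsets involved):

```python
import math

def radial_profile(offsets_deg, bin_width_deg, n_bins):
    """Bin event offsets from a source into annuli and divide by annulus
    area (flat-sky approximation) to get a surface density per deg^2."""
    counts = [0] * n_bins
    for r in offsets_deg:
        i = int(r / bin_width_deg)
        if i < n_bins:
            counts[i] += 1
    profile = []
    for i, n in enumerate(counts):
        r_in, r_out = i * bin_width_deg, (i + 1) * bin_width_deg
        area = math.pi * (r_out ** 2 - r_in ** 2)  # deg^2
        profile.append(n / area)
    return profile
```

The 2-d case for PSF subtraction is the same idea on a pixel grid rather than annuli.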

Pulsar analysis:
Energy-dependent and possibly PSF-dependent cuts.
Overwriting the TIME column for the tools is not
optimal. (Change the name? TIME_ORIG?)
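
One way to avoid the overwriting problem is for a time-correcting tool to stash the original values in a TIME_ORIG column before writing the corrected TIME, so the correction stays reversible and idempotent. A sketch with hypothetical names, representing the event table as a dict of column lists:

```python
def correct_times(table, correction):
    """Apply a per-event time correction, preserving the originals in
    TIME_ORIG instead of silently overwriting TIME. Re-running the
    function recomputes TIME from TIME_ORIG rather than compounding."""
    if "TIME_ORIG" not in table:
        table["TIME_ORIG"] = list(table["TIME"])
    table["TIME"] = [t + correction(t) for t in table["TIME_ORIG"]]
    return table
```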

All threads need to be brought up to date.

Reference documentation needs significant updates.

GUI interface for the tools.
This is rather overarching.
1) What tools need to be in the GUI?
2) What type of GUI?
root GUIs
xselect type (or even use xselect?)

The current tools' GUIs need to be refined. E.g., the save and "should I
overwrite" prompts need to be clearer.

XSPEC-like analysis should be explored.
The ability to get a counts or "pseudo-counts" spectrum with an
appropriate RSP matrix would facilitate multi-mission analysis.
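
What the RSP matrix buys, in essence, is the standard forward-folding step: predicted counts per channel are the model photon spectrum folded through the response and scaled by exposure, C_i = exposure * sum_j R[i][j] * F_j. A minimal illustration of that bookkeeping (the function name is hypothetical, and real responses separate effective area from redistribution):

```python
def fold_model(rsp, model_flux, exposure):
    """Fold a model photon spectrum (photons/cm^2/s per input bin)
    through a response matrix rsp[i][j] to predict counts per channel."""
    return [exposure * sum(r_ij * f_j for r_ij, f_j in zip(row, model_flux))
            for row in rsp]
```

This is the operation XSPEC performs internally; producing a valid RSP for the LAT is the hard part, since the PSF makes the effective response depend on the extraction region.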