Fermi Gamma-ray Space Telescope

LAT Data Selection Recommendations

These recommendations apply to an analysis of LAT data using the Pass 7 Reprocessed Data event classification and instrument response functions ONLY. Caveats on the use of LAT data have been provided by the instrument team.

This is an updated version of the recommendations for analysis of LAT data using the previous P6 event selections.

Event Selection Recommendations (P7)

Analysis Type                         | Minimum Energy (emin) | Maximum Energy (emax) | Max Zenith Angle (zmax) | Event Class (evclass) | IRF Name
--------------------------------------|-----------------------|-----------------------|-------------------------|-----------------------|---------------------
Galactic Point Source Analysis        | 100 MeV               | -                     | 100 deg                 | 2                     | P7REP_SOURCE_V15
Off-plane Point Source Analysis       | 100 MeV               | -                     | 100 deg                 | 2                     | P7REP_SOURCE_V15
Burst and Transient Analysis (<200 s) | 100 MeV               | -                     | 100 deg                 | 0                     | P7REP_TRANSIENT_V15
Galactic Diffuse Analysis             | 100 MeV               | 100000 MeV            | 100 deg                 | 2                     | P7REP_SOURCE_V15
Extra-Galactic Diffuse Analysis       | 100 MeV               | 100000 MeV            | 100 deg                 | 3                     | P7REP_CLEAN_V15

Time Selection Recommendations

Analysis Type                   | ROI-Based Zenith Angle Cut (roicut) | Relational Filter Expression (filter)
--------------------------------|-------------------------------------|--------------------------------------
Galactic Point Source Analysis  | yes                                 | (DATA_QUAL>0)&&(LAT_CONFIG==1)
Off-plane Point Source Analysis | yes                                 | (DATA_QUAL>0)&&(LAT_CONFIG==1)
Burst and Transient Analysis    | yes                                 | (DATA_QUAL>0)&&(LAT_CONFIG==1)
Galactic Diffuse Analysis       | no**                                | (DATA_QUAL>0)&&(LAT_CONFIG==1)
Extra-Galactic Diffuse Analysis | no**                                | (DATA_QUAL>0)&&(LAT_CONFIG==1)

** An exposure correction must be made using the zmax option in gtltcube
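
As an illustration, this correction can be made by passing the same maximum zenith angle to gtltcube when building the livetime cube. This is only a sketch: the file names are placeholders, and dcostheta and binsz are shown at commonly used values rather than as part of this recommendation.

    # Apply the zenith cut in the livetime calculation; file names are placeholders
    gtltcube evfile=photons_gti.fits scfile=spacecraft.fits outfile=ltcube.fits \
             dcostheta=0.025 binsz=1 zmax=100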

Prior to performing any analysis of Fermi data, it is necessary to select the set of photons best suited to the science being performed. In most cases, the data selection criteria (cuts) you make will depend on the type of analysis. Since the photon data are meant to serve every analysis type, you will need to apply these cuts yourself before starting your analysis. There are several standard analysis types, each requiring its own set of data cuts. These are:

  • Point Source Spectral analysis
  • Burst and Transient analysis (for timescales < 200 seconds)
  • Timing analysis (pulsar phase, GRB temporal analysis, etc.)
  • Galactic Diffuse analysis
  • Extragalactic Diffuse analysis (analyses using most or all of the sky, or requiring minimal non-photon contamination).

There are many configurable parameters that can be used in preparing the LAT data for analysis. These parameters are inputs to various tools such as gtselect (described in more detail below). Certain cuts are strongly recommended for all data analyses, while others are flexible, depending on the type of analysis being performed. The tables above outline the recommended cuts for the event selection (gtselect) and time selection (gtmktime) appropriate to the different analysis types. Analysis of more complex sources, or of confused regions, will require optimization of some of these parameters beyond the values listed above. If a parameter is not listed, then there is no specific recommended value for its use within that analysis chain.

For some analyses that require minimal contamination from the Earth's limb, such as analyses of diffuse emission, it may be useful to add a cut on the rocking angle (ABS(ROCK_ANGLE)<52). Users are encouraged to test their analysis cuts on control samples as well as on the ROI of the analysis. For data taken before December 6, 2013 (the date of the switch to the new observing profile), it is safe to continue using the cut on rocking angle as was previously recommended.
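
For example, the rocking angle cut can simply be appended to the recommended filter expression that is passed to gtmktime:

    (DATA_QUAL>0)&&(LAT_CONFIG==1)&&(ABS(ROCK_ANGLE)<52)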

NOTE: Once you have selected a value for a parameter within your analysis, it is strongly recommended that you do not modify that value throughout the analysis chain for that dataset. Changing such values (even for positions) could generate duplicate header keywords, causing the data file to fail processing in later steps.

Preparation of data for analysis

Making cuts using the events file keywords

Selection cuts are typically applied to the LAT data with gtselect, which is documented in detail within the data exploration analysis thread. Cuts are usually made to bound the dataset's time range, energy range, position, region of interest radius, and maximum zenith angle. The maximum zenith angle selection is designed to exclude time periods when any portion of the region of interest is too close to the Earth's limb, which results in elevated background levels. The Earth's limb lies at a zenith angle of 113 degrees, so a suggested value of 100 degrees provides protection against significant contamination by atmospheric gamma rays. (NOTE: This cut requires careful handling when calculating livetime, which is discussed in detail in the Likelihood Livetime and Exposure section.)
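
As an illustrative sketch, a gtselect call implementing the recommended cuts for an off-plane point source analysis might look like the following. The file names, source position, time range, and maximum energy are placeholders and are not part of the recommendation.

    # Recommended cuts for an off-plane point source analysis
    # (file names, position, time range, and emax are placeholders)
    gtselect evclass=2 infile=photons.fits outfile=photons_filtered.fits \
             ra=193.98 dec=-5.82 rad=10 tmin=239557417 tmax=255398400 \
             emin=100 emax=100000 zmax=100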

Additionally, gtselect can be used to include or exclude events that have a lower probability of being photons (event classes 0, 2, or 3; event class 1 is not used). The recommended photon class depends on the analysis in question. The LAT team's recommendations are listed in the "Event Selection Recommendations" table above. The instrument response functions used in the analysis tools must match the event class selected.

The reconstruction and classification of LAT events is performed by the LAT instrument team as part of the Level 1 science data processing. The version of the event reconstruction algorithms used is referred to as Pass N. Currently, the LAT data pipeline uses Pass 7 reconstruction algorithms. This processing is inherent in the science data and cannot be changed. For a given reconstruction, the LAT team generates an accompanying set of parameterized instrument response functions (IRFs) designed for analysis of that particular dataset; these are labeled VN (where N is the version of the IRF). There may be several updates of the IRFs for a given reconstruction, with one recommended for analysis (currently P7REP_V15). The IRFs selected later in the analysis chain must match both the data reconstruction version (currently Pass 7 reprocessed) and the event class selected at this stage, i.e. P7REP_SOURCE_V15 for a point source analysis. A more detailed discussion can be found in the LAT Response Functions section of the Cicerone.
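
For example, in a later likelihood step the irfs parameter must name the IRF set that matches the event class chosen with gtselect. The sketch below assumes an unbinned analysis with placeholder file names; the irfs value is the point being illustrated.

    # irfs must match the selected event class (evclass=2 -> SOURCE class);
    # all file names are placeholders
    gtlike statistic=UNBINNED scfile=spacecraft.fits evfile=photons_gti.fits \
           expmap=expmap.fits expcube=ltcube.fits srcmdl=model.xml \
           irfs=P7REP_SOURCE_V15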

The acceptance cone radius will vary with analysis type and source location. A spectral analysis of a point source in the Galactic plane requires a larger initial analysis region than one for a source off the plane, to allow for fitting of multiple nearby sources. An acceptance cone radius of 10 degrees is appropriate for spectral analysis of point sources off the Galactic plane, while 10-20 degrees may be necessary for point sources located near the Galactic plane. A timing analysis that does not use spectral modeling (for example, pulsar light curves) might select a smaller region to reduce background levels. These studies (timing analysis, GRB spectral analysis, extended source analysis, and others) require optimization of the acceptance cone on a case-by-case basis. Such optimization must consider the largest PSF of the LAT in the energy range of interest, as well as provide sufficient statistics to account for the surrounding sky structure. An all-sky analysis does not require this selection.

Making cuts using the spacecraft file keywords

The spacecraft files are used to select the good time intervals (GTIs) for data analysis with gtmktime. Selecting times when the data quality is good (DATA_QUAL>0) excludes time periods when some spacecraft event has affected the quality of the data. gtmktime can be used to generate and combine GTIs for any logical expression of keywords in the spacecraft file. Contamination from albedo gamma rays from the Earth is reduced by applying an additional timing selection on top of the maximum zenith angle cut made with gtselect. Excluding time intervals when the Earth's limb intersects the selected region of interest is one method of correcting the exposure for the zenith cut; this is accomplished by setting roicut=yes in gtmktime. For other exposure correction options, see the Likelihood Livetime and Exposure section.
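
As a sketch, the recommended time selection for a point source analysis could be applied as follows; the file names are placeholders.

    # Recommended data quality filter with the ROI-based zenith angle cut;
    # file names are placeholders
    gtmktime scfile=spacecraft.fits filter="(DATA_QUAL>0)&&(LAT_CONFIG==1)" \
             roicut=yes evfile=photons_filtered.fits outfile=photons_gti.fits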

GBM data does not typically require selection cuts prior to use. However, cuts can be applied should they be needed for a tailored analysis chain.

Limitations for Data Use

When using Fermi data, it is vital to recognize the limitations associated with these analyses. Each instrument team maintains a set of caveats which must be understood by the user. These are listed on the following pages, and will be kept up to date as the teams refine their understanding of the data and develop their science processing.

» GBM Data Caveats
» LAT Data Caveats

