These recommendations apply to an analysis of LAT data using the Pass 8 Data (P8R3) event classification and instrument response functions ONLY. Caveats on the use of LAT data have been provided by the instrument team.
This is an update of the previous recommendations, which applied to the Pass 7 Reprocessed event selections.
| Analysis Type | Minimum Energy | Maximum Energy | Max Zenith Angle | evclass | IRF Name |
|---|---|---|---|---|---|
| Galactic Point Source Analysis | 100 MeV | 1000000 MeV | 90 degrees | 128 | P8R3_SOURCE_V3 |
| Off-plane Point Source Analysis | 100 MeV | 1000000 MeV | 90 degrees | 128 | P8R3_SOURCE_V3 |
| Burst and Transient Analysis (<200 s) | 100 MeV | 1000000 MeV | 100 degrees | 16 | P8R3_TRANSIENT020_V3 |
| Galactic Diffuse Analysis | 100 MeV | 1000000 MeV | 90 degrees | 128 | P8R3_SOURCE_V3 |
| Extra-Galactic Diffuse Analysis | 100 MeV | 1000000 MeV | 90 degrees | 1024 | P8R3_ULTRACLEANVETO_V3 or P8R3_SOURCEVETO_V3 (when interested in the E>1 GeV energy range) |
| Impulsive Solar Flare Analysis | 100 MeV | 1000000 MeV | 100 degrees | 65536 | P8R3_TRANSIENT015S_V3 |
| Analysis Type | ROI-Based Zenith Angle Cut | Relational Filter Expression |
|---|---|---|
| Galactic Point Source Analysis | no | (DATA_QUAL>0)&&(LAT_CONFIG==1) |
| Off-plane Point Source Analysis | no | (DATA_QUAL>0)&&(LAT_CONFIG==1) |
| Burst and Transient Analysis | yes | (DATA_QUAL>0)&&(LAT_CONFIG==1) |
| Galactic Diffuse Analysis | no | (DATA_QUAL>0)&&(LAT_CONFIG==1) |
| Extra-Galactic Diffuse Analysis | no | (DATA_QUAL>0)&&(LAT_CONFIG==1) |
| Impulsive Solar Flare Analysis | yes | (DATA_QUAL>0\|\|DATA_QUAL==-1)&&(LAT_CONFIG==1) |
IMPORTANT: For analyses where an ROI-based zenith cut is NOT performed, an exposure correction must be made using the "zmax" option in the gtltcube tool.
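As a sketch, the correction could be applied by passing the same maximum zenith angle used in the event selection to gtltcube; the file names here are hypothetical placeholders:

```shell
# Correct the livetime calculation for the zenith-angle cut made earlier with
# gtselect. The zmax value must match the one used in gtselect (90 degrees for
# the recommended point-source selections).
gtltcube evfile=filtered.fits scfile=spacecraft.fits \
    outfile=ltcube.fits dcostheta=0.025 binsz=1 zmax=90
```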
Prior to performing any analysis of Fermi data, it is necessary to select the set of photons best suited to the science being performed. In most cases, the set of data selection criteria (cuts) you make will depend on the analysis being performed. Since the photon data are meant to serve every analysis type, you will need to apply these cuts before you start your analysis. There are several standard analysis types, each requiring its own set of data cuts; these are listed in the tables above.
There are many configurable parameters available when preparing the LAT data for analysis. These parameters are inputs to various tools, such as gtselect (described in more detail below). Certain cuts are strongly recommended for all data analyses, while others are flexible and depend on the type of analysis being performed. The tables above outline the recommended cuts for the event selection (gtselect) and time selection (gtmktime) appropriate to the different analysis types. Analysis of more complex sources, or of confused regions, will require optimizing some of these parameters beyond the values listed above. If a parameter is not listed, then there is no specific recommended value for its use within that analysis chain.
For some analyses that require minimal contamination from the Earth's limb, such as analysis of diffuse emission, it may be useful to add a cut on rocking angle (ABS(ROCK_ANGLE)<52). Users are encouraged to test their analysis cuts on control samples as well as on the ROI of the analysis. For data taken between December 6, 2013 and December 4, 2014 (the period of the Galactic-center biased pointing strategy), a cut on rocking angle is NOT recommended.
NOTE: Once you have selected a value for a parameter within your analysis, it is strongly recommended that you do not modify that value throughout the analysis chain for that dataset. Changing such values (even for positions) could generate duplicate header keywords, causing the data file to fail processing in later steps.
Applying selection cuts to the LAT data is typically performed using gtselect, and is documented in detail in the data exploration analysis thread. Typically, cuts are made to bound the dataset's time range, energy range, position, region of interest radius, and maximum zenith angle. The maximum zenith angle selection is designed to exclude time periods when any portion of the region of interest is too close to the Earth's limb, which results in elevated background levels. The Earth's limb lies at a zenith angle of 113 degrees, so the suggested value of 90 degrees provides protection against significant contamination by atmospheric gamma rays. (NOTE: This cut requires careful handling when calculating livetime, which is discussed in detail in the Likelihood Livetime and Exposure section.)
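As an illustration, the recommended cuts for an off-plane point-source analysis might be applied as follows. The file names, ROI center, time range, and the 15-degree acceptance cone radius are placeholders to be adapted to your dataset; evtype=3 (FRONT+BACK conversions) is the usual Pass 8 default rather than a value taken from the tables above:

```shell
# Apply the recommended event selection for a point-source analysis:
# evclass=128 (SOURCE class), 100 MeV - 1 TeV, zmax=90 degrees.
# Input/output names, ROI center, radius, and time range are examples only.
gtselect evclass=128 evtype=3 \
    infile=events.fits outfile=filtered.fits \
    ra=193.98 dec=-5.82 rad=15 \
    tmin=239557417 tmax=255398400 \
    emin=100 emax=1000000 zmax=90
```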
Additionally, gtselect can be used to include or exclude events that have a higher or lower probability of being photons. The recommended event class depends on the analysis in question; the LAT team's recommendations are listed in the "Event Selection Recommendations" table above. The instrument response functions used in the analysis tools must match the event classes selected.
The reconstruction and classification of LAT events is performed by the LAT instrument team as part of the Level 1 science data processing. The version of the event reconstruction algorithms used is referred to as Pass N. Currently, the LAT data pipeline uses Pass 8 (P8R3) algorithms. This processing is inherent in the science data, and cannot be changed. For a given reconstruction, the LAT team generates an accompanying set of parameterized instrument response functions (IRFs) designed for analysis of that particular dataset; these are labeled VN (where N is the version of the IRF). There may be several updates of the IRFs for a given reconstruction, with one recommended for analysis (currently P8R3_CLASS_V3). The IRFs selected later in the analysis chain must match both the data reconstruction version (currently Pass 8) and the event class selected at this stage, e.g. P8R3_SOURCE_V3 for a point source analysis. A more detailed discussion can be found in the LAT Response Functions section of the Cicerone.
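The evclass-to-IRF pairing from the table above can be encoded as a simple consistency check. The helper below is a hypothetical illustration, not part of the Fermitools:

```python
# Mapping of gtselect evclass values to the matching P8R3 IRF names,
# taken from the event selection recommendations table above.
EVCLASS_TO_IRF = {
    128: "P8R3_SOURCE_V3",           # point source and Galactic diffuse analyses
    16: "P8R3_TRANSIENT020_V3",      # burst and transient analysis (<200 s)
    1024: "P8R3_ULTRACLEANVETO_V3",  # extragalactic diffuse analysis
    65536: "P8R3_TRANSIENT015S_V3",  # impulsive solar flare analysis
}

def irf_for_evclass(evclass: int) -> str:
    """Return the recommended IRF name for a given evclass value."""
    try:
        return EVCLASS_TO_IRF[evclass]
    except KeyError:
        raise ValueError(f"no recommended IRF for evclass={evclass}")

print(irf_for_evclass(128))  # P8R3_SOURCE_V3
```

Checking the chosen IRF against the selected event class this way catches the most common mismatch before the likelihood tools are run.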
The acceptance cone radius will vary with analysis type and source location. Performing a spectral analysis on a point source in the Galactic plane requires a larger initial analysis region than for a source off the plane, to allow for fitting of multiple nearby sources. An acceptance cone radius of 10 deg is appropriate for spectral analysis of point sources off the Galactic plane, while 10-20 deg may be necessary for point sources located near the Galactic plane. A timing analysis that does not use spectral modelling (for example, pulsar lightcurves) might select a smaller region to reduce background levels. These studies (timing analysis, GRB spectral analysis, extended source analysis, and others) require optimization of the acceptance cone on a case-by-case basis. Such optimization will need to consider the largest PSF of the LAT in the energy range of interest, as well as providing sufficient statistics to account for the surrounding sky structure. An all-sky analysis does not require this selection.
The spacecraft files are used to select the good time intervals (GTIs) for data analysis using gtmktime. Selecting times when the data quality is good (DATA_QUAL>0) excludes time periods when some spacecraft event has affected the quality of the data. gtmktime can generate and combine GTIs for any logical expression of keywords in the spacecraft file. Contamination from albedo gamma rays from the Earth is reduced by applying an additional timing selection on top of the maximum zenith angle cut made with gtselect. Excluding time intervals when the Earth's limb intersects the selected region of interest is one method of correcting the exposure for the zenith cut; this is accomplished with the roicut=yes option in gtmktime. For other exposure correction options, see the Likelihood Livetime and Exposure section.
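A minimal sketch for a point-source analysis, using the recommended filter expression from the table above (file names are hypothetical; the exposure correction is then handled later via the zmax option in gtltcube, hence roicut=no):

```shell
# Select good time intervals using the recommended data-quality filter.
# roicut=no leaves the zenith correction to gtltcube's zmax option;
# use roicut=yes instead for burst and transient analyses.
gtmktime scfile=spacecraft.fits \
    filter="(DATA_QUAL>0)&&(LAT_CONFIG==1)" \
    roicut=no evfile=filtered.fits outfile=filtered_gti.fits
```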
GBM data does not typically require selection cuts prior to use. However, cuts can be applied should they be needed for a tailored analysis chain.
When using Fermi data, it is vital to recognize the limitations associated with these analyses. Each instrument team maintains a set of caveats which must be understood by the user. These are listed on the following pages, and will be kept up to date as the teams refine their understanding of the data, and develop their science processing.