Fermi Gamma-ray Space Telescope

Using Fermi-LAT All-sky Weekly Files

Some types of data analysis lend themselves to the use of all-sky data files, rather than repeated downloads of overlapping regions of the sky for each analysis. Examples are:

  • Light curves and spectra for a large number of sources, whether galactic, extragalactic, or both
  • Characterization of large-scale diffuse emission
  • Analysis of moving sources, like the Sun
  • Searches for transient events (like GRBs) that have an isotropic distribution

This analysis thread describes how to generate an all-sky data set for use in other types of analysis. We will then create an exposure-corrected all-sky image as an example of how to use the weekly files.

You can download this tutorial as a Jupyter notebook and run it interactively. Please see the instructions for using the notebooks with the Fermitools.

Download the weekly LAT data files

The weekly LAT data files are available from the FSSC FTP Server. These files contain the LAT data acquired during each week of Fermi's science mission. (Mission weeks run Thursday through Wednesday, UTC time.) The file for the current mission week will also be listed, even though the week may not yet be complete. The current mission week file is regenerated by the FSSC after every data delivery from the LAT instrument team. This guarantees that the FTP Server and the LAT data server contain the same data, and both are up-to-date.

You can download the LAT weekly files one at a time from the FTP site, or you can use wget to download the full set. Once you have the current set of files, you can rerun the same wget command (from the same directory) to download only the newest files. In the event that the LAT team reprocesses the full mission dataset, the filenames will change, and wget will download a full set of the new, reprocessed data.

The command to download the full set of weekly photon files is:

> wget -m -P . -nH --cut-dirs=4 -np -e robots=off https://heasarc.gsfc.nasa.gov/FTP/fermi/data/lat/weekly/photon/

Weekly LAT data files are also available with the diffuse responses pre-computed, as calculating diffuse responses for unbinned likelihood analysis can be computationally intensive. These files can be downloaded with the command below; see the Cicerone for more information on diffuse responses.

The command to download the full set of weekly photon files with pre-computed diffuse response is:

> wget -m -P . -nH --cut-dirs=4 -np -e robots=off https://heasarc.gsfc.nasa.gov/FTP/fermi/data/lat/weekly/diffuse/

LAT data are classified based on the quality of the event reconstruction. The classification can then be used to reduce background and improve performance by filtering out poorly reconstructed events. With the advent of Pass 8, there are many event classes tailored to specific types of analysis; see the Cicerone pages for a full description of the event classes (and types) available in Pass 8. For most analyses the user will want SOURCE class photons (evclass=128). However, analyses that are intrinsically signal-limited require very low background. For example, a user studying large-scale diffuse structures may prefer the CLEAN (evclass=256) or even ULTRACLEAN (evclass=512) class.

There are two types of weekly files available: photon files and extended files. The photon weekly files contain all the SOURCE class events and are usable for most analyses. The extended files contain the same events as the photon files, but with additional reconstruction information about each photon that may be useful if you are trying to characterize the quality of a particular signal, such as for dark matter line searches.

NOTE: For analysis of short-timescale events (for example, GRBs) where the flux of the event is much brighter than the accrued background, the user may be interested in the TRANSIENT010 (evclass=64) event class. If you are interested in investigating transient events within LAT data, you cannot use the weekly files, as they have been prefiltered to contain SOURCE class photons. There are no weekly files that contain the TRANSIENT class events.
In order to generate an all-sky dataset of TRANSIENT class events, you will need to perform a grid of data server queries at the maximum allowed radius of 60 degrees. Once you have downloaded data covering the full sky, you will need to combine the files and filter out duplicate events. We recommend using the FTOOLS suite to perform these tasks, as sketched below.
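
A minimal sketch of that combine-and-deduplicate step, assuming the per-region query results were saved with hypothetical names matching transient_*.fits. Here ftmerge concatenates the EVENTS tables, and ftsort with unique=YES removes the duplicate rows that appear where the query circles overlap (an event repeated in two files has identical TIME and EVENT_ID values). Note that the GTI extensions would still need to be reconciled separately.

> ls transient_*.fits > transient_filelist.txt
> ftmerge @transient_filelist.txt transient_merged.fits
> ftsort "transient_merged.fits[EVENTS]" transient_allsky.fits "TIME,EVENT_ID" unique=YES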

The command to download the extended weekly files is:

> wget -m -P . -nH --cut-dirs=4 -np -e robots=off https://heasarc.gsfc.nasa.gov/FTP/fermi/data/lat/weekly/extended/

In order to do analysis for the full mission, you will also need the mission-long spacecraft file. This file contains entries at 30-second intervals for all periods of active data-taking by the LAT instrument. As a result, it can be rather large. There are also weekly spacecraft files that contain the same information. However, some users have found the pre-generated full-mission file to be more robust during analysis than a concatenated set of weekly files. We therefore recommend re-downloading the full-mission file whenever you update your all-sky data set.

The command to download the mission spacecraft file is:

> wget -m -P . -nH --cut-dirs=4 -np -e robots=off https://heasarc.gsfc.nasa.gov/FTP/fermi/data/lat/mission/spacecraft/

It may be useful to rename the spacecraft file to "spacecraft.fits" for ease of use in other analyses.
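
For example, assuming the merged spacecraft file delivered by the wget command above is named lat_spacecraft_merged.fits (adjust the path to wherever wget placed it):

> mv lat_spacecraft_merged.fits spacecraft.fits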

Combine the data files

You should now have a full set of weekly files and the mission-long spacecraft file. The next step is to combine the weekly files into a single file. We will use the gtselect tool, which is normally used for filtering data; here, however, we want to combine the full data set without removing any events, so we will use slightly different parameters. To combine files, we first create a file list:

> ls lat_photon_weekly* > filelist.txt

If you look at filelist.txt, you will see it is simply a list of all the files in the directory that start with the string "lat_photon_weekly". You will also want to clear any parameters in gtselect from previous analysis that could inadvertently remove events from the dataset. Clear the gtselect parameter file back to defaults with the command:

> punlearn gtselect

Now you're ready to run gtselect to combine the data files. Here we will use INDEF so that gtselect will not make a selection on event class or event type. See the gtselect help file for a description of the evclass and evtype cuts.

> gtselect evclass=INDEF evtype=INDEF
Input FT1 file [] @filelist.txt
Output FT1 file [] lat_alldata.fits
RA for new search center (degrees) (0:360) [] 0
Dec for new search center (degrees) (-90:90) [] 0
radius of new search region (degrees) (0:180) [] 180
start time (MET in s) (0:) [INDEF] INDEF
end time (MET in s) (0:) [INDEF] INDEF
lower energy limit (MeV) (0:) [100] 30
upper energy limit (MeV) (0:) [300000] 1000000
maximum zenith angle value (degrees) (0:180) [] 180

Combining the files takes 10-15 minutes for the full dataset.
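
As a quick sanity check, you can summarize the HDUs of the combined file (including the number of rows in the EVENTS table) with the FTOOLS ftlist task:

> ftlist lat_alldata.fits H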

Now that you have mission-long data and spacecraft files, you are ready to generate AGN light curves, search for flaring sources, or perform any other type of analysis that requires using data from many regions on the sky. A simple example is described below.


Using the LAT data to produce an all-sky image

Step 1: Remove limb photons

First, filter the data for the proper event class and to remove Earth limb contamination. We will use the SOURCE class (evclass=128), as it has been tuned to balance statistics against background for long-duration point source analysis. Here we also include the evtype=3 cut to use both front- and back-converting event types. We will remove limb contamination by excluding events with reconstructed zenith angles greater than 90°, and we will improve the PSF by excluding events with reconstructed energies below 1 GeV. Run gtselect to filter the data and apply the zenith cut.

> gtselect evclass=128 evtype=3
Input FT1 file[] lat_alldata.fits
Output FT1 file[] lat_source_zmax90_gt1gev.fits
RA for new search center (degrees) (0:360) [] 0
Dec for new search center (degrees) (-90:90) [] 0
radius of new search region (degrees) (0:180) [] 180
start time (MET in s) (0:) [] INDEF
end time (MET in s) (0:) [] INDEF
lower energy limit (MeV) (0:) [] 1000
upper energy limit (MeV) (0:) [] 500000
maximum zenith angle value (degrees) (0:180) [] 90

Next, correct the exposure for the events you filtered out. The Fermitools compute exposure based on the Good Time Intervals (GTIs) recorded in the event file. Because we have eliminated some events, we need to update the GTIs for the filtered file. Even though we have applied a zenith cut to the data, we will not correct for it here: for an all-sky analysis, an ROI-based zenith cut would eliminate the entire dataset. Instead, we will later correct the livetime calculation for the zenith angle cut made with gtselect.

The gtmktime tool updates the GTI list based on events we eliminated in the previous step. It also provides the ability to filter on any of the parameters stored in the spacecraft pointing history file. We will use the most basic filter recommended by the LAT team.

> gtmktime
Spacecraft data file [] spacecraft.fits
Filter expression [] (DATA_QUAL>0)&&(LAT_CONFIG==1)
Apply ROI-based zenith angle cut [] no
Event data file [] lat_source_zmax90_gt1gev.fits
Output event file name[] lat_source_zmax90_gt1gev_gti.fits

Step 2: Bin the data

Now you will bin the data in preparation for exposure correction. In order to correct the map for exposure, you have to account for the changing response with energy. To do this, bin the data into a counts cube using gtbin, but with a single energy bin.

> gtbin
Type of output file (CCUBE|CMAP|LC|PHA1|PHA2|HEALPIX) [] CCUBE
Event data file name [] lat_source_zmax90_gt1gev_gti.fits
Output file name [] lat_source_zmax90_gt1gev_ccube.fits
Spacecraft data file name [] spacecraft.fits
Size of the X axis in pixels [] 3600
Size of the Y axis in pixels [] 1800
Image scale (in degrees/pixel) [] 0.1
Coordinate system (CEL - celestial, GAL - galactic) (CEL|GAL) [] GAL
First coordinate of image center in degrees (RA or galactic l) [] 0
Second coordinate of image center in degrees (DEC or galactic b) [] 0
Rotation angle of image axis, in degrees [] 0
Projection method e.g. AIT|ARC|CAR|GLS|MER|NCP|SIN|STG|TAN: [] AIT
Algorithm for defining energy bins (FILE|LIN|LOG) [] LOG
Start value for first energy bin in MeV [] 1000
Stop value for last energy bin in MeV [] 500000
Number of logarithmically uniform energy bins [] 1

Step 3: Calculate exposure map

Because Fermi data are sparse, the accumulated exposure time for any given point in the sky is based on the observatory's pointing history and the response of the instrument to events at various incidence angles and energies. The LAT team has calculated how the instrument responds by modeling the entire instrument and then running a Monte Carlo simulation of many millions of events of different energies and directions. These simulations characterize how gamma rays propagate through an idealized instrument, and the information is recorded in the Instrument Response Functions (IRFs). Since the response depends on how much data filtering is performed, there are different IRFs for the different event classes.

To see which IRFs are available from within the Fermitools, run the gtirfs command.

> gtirfs
P7CLEAN_V6 ( = P7CLEAN_V6::BACK + P7CLEAN_V6::FRONT )
P7CLEAN_V6::BACK
P7CLEAN_V6::FRONT
P7REP_CLEAN_V10 ( = P7REP_CLEAN_V10::FRONT + P7REP_CLEAN_V10::BACK )
P7REP_CLEAN_V10::BACK
P7REP_CLEAN_V10::FRONT
P7REP_CLEAN_V15 ( = P7REP_CLEAN_V15::FRONT + P7REP_CLEAN_V15::BACK )
P7REP_CLEAN_V15::BACK
P7REP_CLEAN_V15::FRONT
P7REP_SOURCE_V10 ( = P7REP_SOURCE_V10::FRONT + P7REP_SOURCE_V10::BACK )
P7REP_SOURCE_V10::BACK
P7REP_SOURCE_V10::FRONT
P7REP_SOURCE_V15 ( = P7REP_SOURCE_V15::FRONT + P7REP_SOURCE_V15::BACK )
P7REP_SOURCE_V15::BACK
P7REP_SOURCE_V15::FRONT

(...Some output suppressed...)

P8R3_ULTRACLEAN_V3 (PSF) ( = P8R3_ULTRACLEAN_V3::PSF0 + P8R3_ULTRACLEAN_V3::PSF1 + P8R3_ULTRACLEAN_V3::PSF2 + P8R3_ULTRACLEAN_V3::PSF3 )
P8R3_ULTRACLEAN_V3::BACK
P8R3_ULTRACLEAN_V3::EDISP0
P8R3_ULTRACLEAN_V3::EDISP1
P8R3_ULTRACLEAN_V3::EDISP2
P8R3_ULTRACLEAN_V3::EDISP3
P8R3_ULTRACLEAN_V3::FRONT
P8R3_ULTRACLEAN_V3::PSF0
P8R3_ULTRACLEAN_V3::PSF1
P8R3_ULTRACLEAN_V3::PSF2
P8R3_ULTRACLEAN_V3::PSF3

The appropriate IRF for this data set is P8R3_SOURCE_V3.

Calculating exposure for an all-sky map first requires calculating the instrument livetime across the entire sky (using gtltcube) and then convolving the livetime with the IRF (using gtexpcube2). The pixel size for the livetime cube does not need to match the binned data, but you do need to provide the event file, as gtltcube uses its good time intervals to calculate the livetime at each position.

You will also need to correct the livetime for the zenith angle cut made with gtselect earlier. To do this, use the "zmax" option on the command line and match the value used with gtselect. This modifies the livetime calculation to account for the events you removed earlier.

Here is an example of the input for gtltcube:

> gtltcube zmax=90
Event data file [] lat_source_zmax90_gt1gev_gti.fits
Spacecraft data file [] spacecraft.fits
Output file [] lat_source_zmax90_gt1gev_ltcube.fits
Step size in cos(theta) (0.:1.) [0.025]
Pixel size (degrees) [1]
Working on file spacecraft.fits
.....................!

Now you can calculate the exposure at each point in the sky in each energy bin (we used only one bin) using gtexpcube2. You will need to enter the pixel geometry and energy binning to match what you used in the counts cube. Also, be sure to use "CENTER" for the energy layer reference; if you use "EDGE", you will get an additional energy plane in the output file that causes complications later.

> gtexpcube2
Livetime cube file[] lat_source_zmax90_gt1gev_ltcube.fits
Counts map file[] none
Output file name[] lat_source_zmax90_gt1gev_expcube1.fits
Response functions to use[CALDB] CALDB
Size of the X axis in pixels[INDEF] 3600
Size of the Y axis in pixels[INDEF] 1800
Image scale (in degrees/pixel)[INDEF] 0.1
Coordinate system (CEL - celestial, GAL -galactic) (CEL|GAL) [GAL] GAL
First coordinate of image center in degrees (RA or galactic l)[INDEF] 0
Second coordinate of image center in degrees (DEC or galactic b)[INDEF] 0
Rotation angle of image axis, in degrees[0.] 0
Projection method e.g. AIT|ARC|CAR|GLS|MER|NCP|SIN|STG|TAN[CAR] AIT
Start energy (MeV) of first bin[INDEF] 1000
Stop energy (MeV) of last bin[INDEF] 500000
Number of logarithmically-spaced energy bins[INDEF] 1
Computing binned exposure map....................!

Note: If trying to use CALDB in the example above does not work, just put the IRF name in explicitly. The appropriate IRF for this data set is P8R3_SOURCE_V3.

If you open the exposure map in ds9, you will see that the image has two energy planes. The first plane is for 1-500 GeV; the second is for 500 GeV and above. We will use only the first image plane to correct our all-sky map.
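
If you prefer the command line, you can confirm the number of planes by listing the file structure with ftlist; the primary image should show a third axis of length 2:

> ftlist lat_source_zmax90_gt1gev_expcube1.fits H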

Step 4: Correct the all-sky image for exposure and scale

To complete the exposure correction, you must perform a series of arithmetic operations on the all-sky counts cube using the FTOOLS fextract, farith, fimgtrim, and fcarith. First, extract the first of the two exposure planes:
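
(A minimal sketch of the extraction, assuming fextract accepts CFITSIO extended filename syntax so that an image section can select the first plane; the exact prompts may vary between FTOOLS versions.)

> fextract "lat_source_zmax90_gt1gev_expcube1.fits[0][*,*,1]" lat_source_zmax90_gt1gev_expcube.fits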

Next, correct the value of each pixel for the exposure:

> farith
Name of 1st FITS file and [ext#] [] lat_source_zmax90_gt1gev_ccube.fits
Name of 2nd FITS file and [ext#] [] lat_source_zmax90_gt1gev_expcube.fits
Name of OUTFIL FITS file [] lat_source_zmax90_gt1gev_corrmap.fits
Operation to perform (ADD,SUB,DIV,MUL(or +,-,/,*),MIN,MAX) [] DIV

Then trim the pixels that were outside the Aitoff projection to zero, so they don't affect your display.

> fimgtrim
Input image file: [] lat_source_zmax90_gt1gev_corrmap.fits
Lower threshold value: [] 0
Lower constant value: [] 0
Upper threshold value: [] INDEF
Output image file: [] lat_source_zmax90_gt1gev_corrmap2.fits

****** successfully exited ******

Finally, scale the image so that the maximum pixel is equal to 255 (the typical maximum dynamic range for graphics). There are a number of ways to find the maximum pixel value. Here we have used ds9 to find a maximum pixel value of 4.143×10⁻⁸. This means our scale factor should be 255/(4.143×10⁻⁸), or 6.155×10⁹.
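
As an alternative to ds9, the FTOOLS fimgstat task reports image statistics, including the maximum pixel value (a minimal sketch; passing INDEF for both thresholds includes all pixels):

> fimgstat lat_source_zmax90_gt1gev_corrmap2.fits INDEF INDEF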

> fcarith
Name of FITS file and [ext#] [] lat_source_zmax90_gt1gev_corrmap2.fits
Value of the CONSTANT [] 6.155e9
Name of OUTFILE FITS file [] lat_source_zmax90_gt1gev_corrmap3.fits
Enter Operation ADD,SUB,DIV,MUL(or +,-,/,*) [] MUL

You can look at the final product in ds9, using log scaling. This process is approximately the same as that used to create the LAT all-sky images released by the instrument team. The same method can be used for different energy ranges, different event class selections, or even to compare different data sets.
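
For example, to open the final map with logarithmic scaling (assuming ds9 is on your path):

> ds9 lat_source_zmax90_gt1gev_corrmap3.fits -scale log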


Last updated by: N. Mirabal, 04/10/2019