It is recommended that you first gain some basic familiarity with the structure and content of the LAT data products, as well as the process for preparing the data by making cuts on the data file. This tutorial demonstrates various ways of examining and manipulating the LAT data; it is a simple approach, useful for quickly exploring the data.
IMPORTANT! In almost all cases, light curves and energy spectra need to be produced by a Likelihood Analysis using gtlike for robust results.
In addition to the Fermitools, you will also be using the following FITS File Viewers:
Some of the files used in this tutorial were prepared within the Data Preparation tutorial, and are linked here:
Alternatively, you can select your own region and time period of interest from the LAT data server and substitute that. Photon and spacecraft data files are all that you need for the analysis.
In this section we look at several different ways to explore the data and make quick plots and images of the data. First, we'll look at making quick counts maps with ds9; then we'll perform a more in depth exploration of the data using fv.
To view the data, use ds9 to create a quick counts map of the events in the file.
For example, to look at the 3C 279 data file, type the following command:
prompt> ds9 -bin factor 0.1 0.1 -cmap b -scale sqrt 3C279_region_filtered_gti.fits &
A ds9 window will open up and an image similar to the one shown below will be displayed.
Breaking the command line into its parts, we find that:
Note: The default factor is 1, so if you leave this off the command line you will get 1-degree bins.
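As a quick sanity check, the bin factor maps directly to the size of the binned image. A minimal sketch in Python (the 40-degree width is an assumption matching the 20-degree-radius extraction region used in this tutorial):

```python
def n_bins(region_width_deg, bin_factor_deg):
    """Number of bins ds9 produces across a region of the given width
    when binning sky coordinates at bin_factor_deg degrees per bin."""
    return round(region_width_deg / bin_factor_deg)

# A 20-degree-radius circular region is 40 degrees across.
print(n_bins(40.0, 0.1))  # 0.1-degree bins -> 400 bins across
print(n_bins(40.0, 1.0))  # default factor of 1 -> 40 bins across
```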
fv gives you much more interactive control over how you explore the data. It can make plots and 1D and 2D histograms; allow you to look at the data directly; and enable you to view the FITS file headers to check some of the important keywords. Starting it up is easy: just type fv and the filename:
prompt> fv 3C279_region_filtered_gti.fits &
This will bring up two windows: the general fv menu window (at right); and another window shown below, which contains summary information about the file you are looking at:
For the purposes of the tutorial, we will only be using the summary window, but feel free to explore the options in the main menu window as well.
Looking at the summary window, notice that:
From this window, data can be viewed in different ways:
Let's look at each of these in turn.
Click on the button for the EVENTS extension; a new window listing all the header keywords and their values for this extension will be displayed. Notice that the same information is presented that was shown in the summary window; namely that: the data is a binary table (XTENSION='BINTABLE'); there are 123857 entries (NAXIS2=123857); and there are 22 data values for each event (TFIELDS=22).
In addition, there is information about the size (in bytes) of each row and the descriptions of each of the data fields contained in the table.
As you scroll down, you will find some other useful information as shown in the screen shot below:
There are some fairly important keywords on this screen, including:
Finally, as you scroll down to the bottom of the header, you will see:
This part of the header contains information about the data cuts that were made to extract the data. These are contained in the various DSS keywords. For a full description of the meaning of these values, see the DSS keyword page. In this file, the DSVAL1 keyword tells us which event class has been used; the DSVAL2 keyword tells us that the data were extracted from a circular region, 20 degrees in radius, centered on RA=193.98 and DEC=-5.82; the DSVAL3 keyword shows that the valid time range is defined by the GTIs; the DSVAL4 keyword shows the selected energy range in MeV; and DSVAL5 indicates that a zenith angle cut has been applied.
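Because the DSS values are plain strings, they are easy to inspect programmatically. A minimal sketch in Python (the circle string below mirrors the DSVAL2 format described above; the exact formatting in your own file may differ, so treat this as an assumption and check your header):

```python
import re

# Example DSVAL2-style value for a circular acceptance cone:
# CIRCLE(RA, DEC, radius), all in degrees.
dsval2 = "CIRCLE(193.98,-5.82,20)"

match = re.match(r"CIRCLE\(([-\d.]+),([-\d.]+),([-\d.]+)\)", dsval2)
ra, dec, radius = (float(v) for v in match.groups())
print(ra, dec, radius)  # 193.98 -5.82 20.0
```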
fv can also be used to make a quick counts map to see what the region you extracted looks like. To do this:
Note: We use the histogram option rather than the plot option, which would produce a scatter plot.
fv will automatically fill in the TLMin, TLMax, Data Min and Data Max fields based on the header keywords and data values for that column. It will also make guesses for the values of the Min, Max and Bin Size fields.
In this example, we've selected the limits to be just larger than the values in the Data Min and Data Max field for each column.
For this map, we've selected 0.1 degree bins.
For example, if you wanted to make an approximate flux map, you could select the ENERGY column in the Weight field and the counts would be weighted by their energy.
This will create the plot in a new window and keep the histogram window open in case you want to make changes and create a different image. The "Make/Close" button will create the image and close the histogram window.
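The 2D histogram fv builds can be sketched in a few lines of Python. This is an illustration only (hypothetical event tuples, stdlib-only, not fv's actual code), showing how weighting counts by ENERGY turns a counts map into an approximate flux map:

```python
import math
from collections import defaultdict

# Hypothetical events: (RA, DEC, ENERGY in MeV). Real values come
# from the EVENTS table of the FITS file.
events = [(193.9, -5.8, 150.0), (194.1, -5.8, 2500.0), (193.9, -5.9, 400.0)]

bin_size = 0.1  # degrees, as in the map above

counts = defaultdict(int)      # plain counts map
weighted = defaultdict(float)  # ENERGY-weighted, flux-like map
for ra, dec, energy in events:
    key = (math.floor(ra / bin_size), math.floor(dec / bin_size))
    counts[key] += 1
    weighted[key] += energy

print(sum(counts.values()), sum(weighted.values()))  # 3 3050.0
```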
fv also allows you to adjust the color and scale, just as you can in ds9. However, it has a different selection of color maps.
As in ds9, the default is gray scale. The image at right was displayed with the cold color map, selected by clicking on the "Colors" menu item, then selecting: "Continuous" submenu --> "cold" check box.
While ds9 and fv can be used to make quick look plots when exploring the data, they don't automatically do all the things you would like when making data files for analysis. For this, you will need to use the Fermi-specific gtbin tool to manipulate the data.
You can use gtbin to bin photon data into the following representations:
This has the advantage of creating files in exactly the format needed by the other science tools (as well as external tools such as XSPEC), and of adding correct WCS keywords to the images, so that the coordinate systems are properly displayed in image viewers (such as ds9 and fv) that can correctly interpret the WCS keywords.
In this section we will use the 3C279_region_filtered_gti.fits file to make images and will look at the results with ds9. In the Explore LAT Data (for Burst) section we will show how to use gtbin to produce a light curve.
Just as with fv and ds9, gtbin can be used to make counts maps out of the extracted data.
The main advantage of using gtbin is that it adds the proper header keywords, so that the coordinate system is properly displayed as you move around the image. Here we'll make the same image of the 3C 279 region that we made with fv and ds9, but this time we'll use the gtbin tool to make the image.
gtbin is invoked on the command line with or without the name of the file you want to process. If no file name is given, gtbin will prompt for it. Here is the input and output from running the task:
Type of output file (CCUBE|CMAP|LC|PHA1|PHA2)  CMAP
Event data file name 3C279_region_filtered_gti.fits
Output file name 3C279_region_cmap.fits
Spacecraft data file name NONE
Size of the X axis in pixels 400
Size of the Y axis in pixels 400
Image scale (in degrees/pixel) 0.1
Coordinate system (CEL - celestial, GAL -galactic) (CEL|GAL) [CEL]
First coordinate of image center in degrees (RA or galactic l) 193.98
Second coordinate of image center in degrees (DEC or galactic b) -5.82
Rotation angle of image axis, in degrees[0.]
Projection method e.g. AIT|ARC|CAR|GLS|MER|NCP|SIN|STG|TAN: AIT
gtbin: WARNING: No spacecraft file: EXPOSURE keyword will be set equal to ontime.
There are many different possible projection types. For a small region the difference is small, but you should be aware of it.
When you start the tool, gtbin first asks which type of output you want.
In this case, we want a counts map so we:
The CCUBE (counts cube) option produces a set of count maps over several energy bins.
Note: We select a 400x400 pixel image with 0.1 degree pixels in order to create an image that contains all the extracted data.
Here is the output counts map file to use for comparison.
Compare this result to the images made with fv and ds9 and you will notice that the image is flipped along the y-axis. This is because the coordinate system keywords have been properly added to the image header, and the Right Ascension (RA) coordinate actually increases from right to left, not left to right. Moving the cursor over the image now shows the RA and Dec of the cursor position in the FK5 fields in the top left section of the display.
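Near the field center, the WCS keywords gtbin writes reduce to a simple linear mapping, with a negative step on the RA axis encoding the right-to-left direction. A hedged sketch (the keyword values below are assumptions matching the 400x400, 0.1-degree map above, and the linear form ignores projection distortion away from the center):

```python
# Approximate linear WCS, valid near the reference pixel.
# CRPIX = reference pixel, CRVAL = sky position there, CDELT = step size.
CRPIX1, CRVAL1, CDELT1 = 200.5, 193.98, -0.1  # RA axis: negative step
CRPIX2, CRVAL2, CDELT2 = 200.5, -5.82, 0.1    # Dec axis

def pix_to_sky(x, y):
    """Linear pixel -> (RA, Dec) mapping near the reference pixel."""
    ra = CRVAL1 + (x - CRPIX1) * CDELT1
    dec = CRVAL2 + (y - CRPIX2) * CDELT2
    return ra, dec

# Moving one pixel to the right *decreases* RA by 0.1 degrees,
# which is why the image appears flipped relative to fv's histogram.
print(pix_to_sky(201.5, 200.5))  # RA ~193.88
print(pix_to_sky(199.5, 200.5))  # RA ~194.08
```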
If you want to look at coordinates in another system, such as galactic coordinates, you can make the change by first selecting the 'WCS' button (in the first row of buttons), and then the appropriate coordinate system from the choices that appear in the second row of buttons (FK4, FK5, ICRS, Galactic or Ecliptic).
In this section, we explore ways of generating and looking at exposure maps. If you have not yet run gtmktime on the data file you are examining, this analysis will likely yield incorrect results. It is advisable to prepare your data file properly by following the Data Preparation tutorial before looking in detail at the livetime and exposure.
Generally, to look at the exposure you must:
In order to determine the exposure for your source, you need to know how much time the LAT has observed any given position on the sky at any given inclination angle. The gtltcube tool calculates this 'livetime cube' for the entire sky, from the spacecraft (pointing and livetime history) file, over the time range covered by the event file.
Event data file  3C279_region_filtered_gti.fits
Spacecraft data file  spacecraft.fits
Output file  3C279_region_ltcube.fits
Step size in cos(theta) (0.:1.)  0.025
Pixel size (degrees)  1
Working on file spacecraft.fits
The event file is needed to get the time range over which to generate the livetime cube.
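The cos(theta) step size controls how finely the inclination-angle dependence is sampled. A small stdlib sketch of the resulting grid (an illustration of what the parameter means, not gtltcube's internal code):

```python
step = 0.025  # cos(theta) step, as entered above

# cos(theta) runs from 0 to 1 (theta from 90 to 0 degrees),
# so the number of inclination-angle bins is:
n_bins = round(1.0 / step)
print(n_bins)  # 40 bins

edges = [i * step for i in range(n_bins + 1)]
print(edges[0], edges[-1])  # 0.0 1.0
```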
In some cases, you will have multiple livetime cubes covering different periods of time that you wish to combine in order to examine the exposure over the entire time range. One example would be the researcher who generates weekly flux datapoints for light curves, and has the need to analyze the source significance over a larger time period. In this case, it is much less CPU-intensive to combine previously generated livetime cubes before calculating the exposure map, than to start the livetime cube generation from scratch. To combine multiple livetime cubes into a single cube use the gtltsum tool.
Note: gtltsum is quick, but it does have a few limitations, including:
For example, if you want to add four cubes (c1, c2, c3, and c4), you cannot add c1 and c2 to get cube_a, then add cube_a and c3 and save the result as cube_a; you must give the output a different name.
Livetime cube 1 or list of files 3C279_region_first_ltcube.fits
Livetime cube 2[none] 3C279_region_second_ltcube.fits
Output file  3C279_region_summed_ltcube.fits
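Conceptually, livetime cubes are additive: each sky/inclination bin holds accumulated livetime in seconds, so combining two cubes just sums the bins. A toy illustration in Python (hypothetical numbers, not gtltsum's actual code):

```python
# Hypothetical livetime (seconds) per bin for two observing periods.
week1 = [120.0, 95.0, 60.0]
week2 = [110.0, 100.0, 75.0]

# gtltsum conceptually performs a bin-by-bin sum of the livetimes.
combined = [a + b for a, b in zip(week1, week2)]
print(combined)  # [230.0, 195.0, 135.0]
```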
Once you have a livetime cube for the entire dataset, you need to calculate the exposure for your dataset. This can be in the form of an exposure map or an exposure cube.
For simplicity, we will generate an exposure map by running the gtexpmap tool on the event file. Since six months is a large amount of data for an unbinned analysis, this computation will take a long time. Skip past this section to find a copy of the output file.
gtexpmap allows you to control the exposure map parameters, including:
The following example shows input and output for generating an exposure map for the region surrounding 3C 279.
The exposure maps generated by this tool are meant
to be used for *unbinned* likelihood analysis only.
Do not use them for binned analyses.
Event data file 3C279_region_filtered_gti.fits
Spacecraft data file spacecraft.fits
Exposure hypercube file 3C279_region_ltcube.fits
output file name 3C279_exposure_map.fits
Response functions P8R3_SOURCE_V2
Radius of the source region (in degrees) 30
Number of longitude points (2:1000)  500
Number of latitude points (2:1000)  500
Number of energies (2:100)  30
The radius of the source region, 30, should be significantly larger
(say by 10 deg) than the ROI radius of 20
Computing the ExposureMap using 3C279_region_ltcube.fits
You are prompted for the:
For more discussion on the proper instrument response function (IRF) to use in your data analysis, see the overview discussion in the Cicerone, as well as the current recommended data selection information from the LAT team. The LAT data caveats are also important to review before starting LAT analysis.
The next set of parameters specify the size, scale, and position of the map to generate.
The source region is different from the region of interest (ROI): it is the region of the sky that you will model when fitting your data. Since every region of the sky that contains sources will also have adjacent regions containing sources, it is advisable to model an area larger than that covered by your dataset. Here we have increased the source region by an additional 10°, which is the minimum needed for an actual analysis. Be aware of what sources may be near your region, and model them if appropriate (especially if they are very bright in gamma rays).
This number can be small (∼5) for sources with flat spectra in the LAT regime. However, for sources like pulsars that vary in flux significantly over the LAT energy range, a larger number of energies is recommended, typically 10 per decade in energy.
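The "10 energies per decade" rule translates into a logarithmically spaced energy grid. A short sketch (the 100 MeV to 300 GeV range below is an assumption for illustration; use the energy range from your own DSS cuts):

```python
import math

emin, emax = 100.0, 300000.0  # MeV (assumed range, for illustration)
per_decade = 10

decades = math.log10(emax / emin)
n_energies = math.ceil(per_decade * decades) + 1
print(n_energies)  # ~3.48 decades at 10 per decade -> 36 grid points

# Logarithmically spaced energies from emin to emax.
energies = [emin * 10 ** (i * decades / (n_energies - 1))
            for i in range(n_energies)]
```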
Here is the output exposure map generated in this example.
Once the file has been generated, it can be viewed with ds9. When you open the file in ds9, a "Data Cube" window will appear, allowing you to select between the various maps generated.
Below are four of the 30 layers in the map, scaled by log(Energy), and extracted from ds9.
[Images: the first, fourth, tenth, and last layers of the exposure map]
As you can see, the exposure changes as you go to higher energies. This is due to two effects:
Both of these effects are quantified on the LAT performance page.
Last updated: 10/04/2018