Analyzing GRB 080916C

(Image credit: NASA/Swift/Cruz deWilde)

To demonstrate the capabilities and features of 3ML, we will go through a time-integrated and time-resolved analysis. This example serves both as a standard way to analyze Fermi-GBM data with 3ML and as a template for designing your own instrument's analysis pipeline with 3ML if you have similar data.

3ML provides utilities to reduce time series data to plugins in a correct and statistically justified way (e.g., background fitting of Poisson data is done with a Poisson likelihood). The approach is generic and can be extended. For more details, see the time series documentation.
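
As a preview, the reduction from a time series to a plugin follows a simple pattern. The sketch below mirrors the GBM-specific calls used later in this notebook; the file names and interval strings are placeholders:

# minimal sketch of the time series -> plugin reduction
# (file names and interval strings are placeholders)
from threeML import TimeSeriesBuilder

ts = TimeSeriesBuilder.from_gbm_tte(
    "n3", tte_file="my_tte.fit", rsp_file="my_rsp.rsp"
)

# fit a polynomial background to off-source intervals
# (Poisson data are fit with a Poisson likelihood)
ts.set_background_interval("-20--5", "100-200")

# select the source interval and reduce to a 3ML plugin
ts.set_active_time_interval("0-30")
plugin = ts.to_spectrumlike()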

[1]:
import warnings

warnings.simplefilter("ignore")
[2]:
%%capture
import matplotlib.pyplot as plt
import numpy as np

np.seterr(all="ignore")


from threeML import *
from threeML.io.package_data import get_path_of_data_file
[3]:

silence_warnings()
%matplotlib inline
from jupyterthemes import jtplot

jtplot.style(context="talk", fscale=1, ticks=True, grid=False)
set_threeML_style()

Examining the catalog

As with Swift and Fermi-LAT, 3ML provides a simple interface to the on-line Fermi-GBM catalog. Let’s get the information for GRB 080916C.

[4]:
gbm_catalog = FermiGBMBurstCatalog()
gbm_catalog.query_sources("GRB080916009")
[4]:
Table length=1
name          ra       dec      trigger_time    t90
object        float64  float64  float64         float64
GRB080916009  119.800  -56.600  54725.0088613   62.977

To aid in quickly replicating the catalog analysis, and thanks to the tireless efforts of the Fermi-GBM team, we have added the ability to extract the analysis parameters from the catalog as well as to build an astromodels model with the best-fit parameters baked in. Using this information, one can quickly run through the catalog and replicate the entire analysis with a script. Let's give it a try.

[5]:
grb_info = gbm_catalog.get_detector_information()["GRB080916009"]

gbm_detectors = grb_info["detectors"]
source_interval = grb_info["source"]["fluence"]
background_interval = grb_info["background"]["full"]
best_fit_model = grb_info["best fit model"]["fluence"]
model = gbm_catalog.get_model(best_fit_model, "fluence")["GRB080916009"]
[6]:
model
[6]:
Model summary:

N
Point sources 1
Extended sources 0
Particle sources 0


Free parameters (5):

value min_value max_value unit
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.K 0.012255 0.0 None keV-1 s-1 cm-2
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.alpha -1.130424 -1.5 2.0
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.break_energy 309.2031 10.0 None keV
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.break_scale 0.3 0.0 10.0
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.beta -2.096931 -5.0 -1.6


Fixed parameters (3):
(abridged. Use complete=True to see all fixed parameters)


Linked parameters (0):

(none)

Independent variables:

(none)

Downloading the data

We provide a simple interface to download the Fermi-GBM data. Using the information from the catalog that we have extracted, we can download just the data from the detectors that were used for the catalog analysis. This will download the CSPEC, TTE and instrument response files from the on-line database.

[7]:
dload = download_GBM_trigger_data("bn080916009", detectors=gbm_detectors)

Let’s first examine the catalog fluence fit. Using the TimeSeriesBuilder, we can fit the background, set the source interval, and create a 3ML plugin for the analysis. We will loop through the detectors, set their appropriate channel selections, and ensure there are enough counts in each bin to make the PGStat profile likelihood valid.

  • First we use the CSPEC data to fit the background using the background selections. We use CSPEC because it has a longer duration for fitting the background.

  • The background is saved to an HDF5 file that stores the polynomial coefficients and selections which we can read in to the TTE file later.

  • The light curve is plotted.

  • The source selection from the catalog is set and a DispersionSpectrumLike plugin is created.

  • The standard GBM channel selections for spectral analysis are set on the plugin.

[8]:
fluence_plugins = []
time_series = {}
for det in gbm_detectors:

    # fit the background on the longer-duration CSPEC data
    ts_cspec = TimeSeriesBuilder.from_gbm_cspec_or_ctime(
        det, cspec_or_ctime_file=dload[det]["cspec"], rsp_file=dload[det]["rsp"]
    )

    ts_cspec.set_background_interval(*background_interval.split(","))
    ts_cspec.save_background(f"{det}_bkg.h5", overwrite=True)

    # build the TTE time series, restoring the saved background
    ts_tte = TimeSeriesBuilder.from_gbm_tte(
        det,
        tte_file=dload[det]["tte"],
        rsp_file=dload[det]["rsp"],
        restore_background=f"{det}_bkg.h5",
    )

    time_series[det] = ts_tte

    # select the catalog fluence interval and plot the light curve
    ts_tte.set_active_time_interval(source_interval)

    ts_tte.view_lightcurve(-40, 100)

    fluence_plugin = ts_tte.to_spectrumlike()

    # standard GBM channel selections: BGO vs. NaI detectors
    if det.startswith("b"):

        fluence_plugin.set_active_measurements("250-30000")

    else:

        fluence_plugin.set_active_measurements("9-900")

    # ensure enough counts per bin for a valid PGStat
    fluence_plugin.rebin_on_background(1.0)

    fluence_plugins.append(fluence_plugin)
../_images/notebooks_grb080916C_12_9.png
../_images/notebooks_grb080916C_12_10.png
../_images/notebooks_grb080916C_12_11.png

Setting up the fit

Let’s see if we can reproduce the results from the catalog.

Set priors for the model

We will fit the spectrum using Bayesian analysis, so we must set priors on the model parameters.

[9]:
model.GRB080916009.spectrum.main.shape.alpha.prior = Truncated_gaussian(
    lower_bound=-1.5, upper_bound=1, mu=-1, sigma=0.5
)
model.GRB080916009.spectrum.main.shape.beta.prior = Truncated_gaussian(
    lower_bound=-5, upper_bound=-1.6, mu=-2.25, sigma=0.5
)
model.GRB080916009.spectrum.main.shape.break_energy.prior = Log_normal(mu=2, sigma=1)
model.GRB080916009.spectrum.main.shape.break_energy.bounds = (None, None)
model.GRB080916009.spectrum.main.shape.K.prior = Log_uniform_prior(
    lower_bound=1e-3, upper_bound=1e1
)
model.GRB080916009.spectrum.main.shape.break_scale.prior = Log_uniform_prior(
    lower_bound=1e-4, upper_bound=10
)

Clone the model and set up the Bayesian analysis class

Next, we clone the model we built from the catalog so that the original remains available for comparison, and we fit the clone. We pass this model and a DataList of the plugins to a BayesianAnalysis class and set the sampler to MultiNest.

[10]:
new_model = clone_model(model)

bayes = BayesianAnalysis(new_model, DataList(*fluence_plugins))

# share spectrum gives a linear speed up when
# spectrumlike plugins have the same RSP input energies
bayes.set_sampler("multinest", share_spectrum=True)

Examine the catalog fitted model

We can quickly examine how well the catalog fit matches the data. There appears to be a discrepancy between the data and the model! Let’s refit to see if we can fix it.

[11]:
fig = display_spectrum_model_counts(bayes, min_rate=20, step=False)
../_images/notebooks_grb080916C_18_0.png

Run the sampler

We let MultiNest condition the model on the data.

[12]:
bayes.sampler.setup(n_live_points=400)
bayes.sample()
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
GRB080916009...K (1.470 -0.017 +0.018) x 10^-2 1 / (cm2 keV s)
GRB080916009...alpha -1.074 +/- 0.019
GRB080916009...break_energy (2.29 -0.29 +0.30) x 10^2 keV
GRB080916009...break_scale (2.5 +/- 0.8) x 10^-1
GRB080916009...beta -2.10 -0.10 +0.09

Values of -log(posterior) at the minimum:

-log(posterior)
b0 -1051.732777
n3 -1021.880413
n4 -1014.066310
total -3087.679500

Values of statistical measures:

statistical measures
AIC 6185.529454
BIC 6204.761664
DIC 6172.115935
PDIC 4.407602
log(Z) -1346.866178
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  400
 dimensionality =    5
 *****************************************************
 ln(ev)=  -3101.2739839188271      +/-  0.22544206382306003
 Total Likelihood Evaluations:        22781
 Sampling finished. Exiting MultiNest

Now our model matches the data much better!

[13]:
bayes.restore_median_fit()
fig = display_spectrum_model_counts(bayes, min_rate=20)
../_images/notebooks_grb080916C_22_0.png

But how different are we from the catalog model? Let's plot our fit along with the catalog model. Luckily, 3ML can handle all the units for us.

[14]:
# evaluate the catalog model's vFv spectrum on an energy grid
energy_grid = np.logspace(1, 4, 100) * u.keV
vFv = (energy_grid ** 2 * model.get_point_source_fluxes(0, energy_grid)).to(
    "erg2/(cm2 s keV)"
)
[15]:
fig = plot_spectra(bayes.results, flux_unit="erg2/(cm2 s keV)")
ax = fig.get_axes()[0]
_ = ax.loglog(energy_grid, vFv, color="blue", label="catalog model")
../_images/notebooks_grb080916C_25_2.png
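
Beyond the flux-space comparison, the posterior itself can be inspected. A minimal sketch, assuming the optional corner-plot dependency used by 3ML is installed:

# sketch: marginal posteriors and parameter correlations of the Bayesian fit
corner_fig = bayes.results.corner_plot()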

Time-resolved analysis

Now that we have examined the fluence fit, we can move on to a time-resolved analysis.

Selecting a temporal binning

We first get the brightest NaI detector and create time bins via the Bayesian blocks algorithm. We can use the fitted background to make sure that our intervals are chosen in an unbiased way.

[16]:
n3 = time_series["n3"]
[17]:
n3.create_time_bins(0, 60, method="bayesblocks", use_background=True, p0=0.2)

Sometimes, glitches in the GBM data cause spikes that the Bayesian blocks algorithm interprets as fast changes in the count rate. We will have to remove those intervals manually.

Note: In the future, 3ML will provide an automated method to remove these unwanted spikes.

[18]:
fig = n3.view_lightcurve(use_binner=True)
../_images/notebooks_grb080916C_30_0.png
[19]:
# flag Bayesian-block bins narrower than 50 ms as glitch artifacts
bad_bins = []
for i, w in enumerate(n3.bins.widths):

    if w < 5e-2:
        bad_bins.append(i)


# rebuild the bin edges, skipping the flagged bins
edges = [n3.bins.starts[0]]

for i, b in enumerate(n3.bins):

    if i not in bad_bins:
        edges.append(b.stop)

starts = edges[:-1]
stops = edges[1:]


# apply the cleaned edges as a custom binning
n3.create_time_bins(starts, stops, method="custom")

Now our light curve looks much more acceptable.

[20]:
fig = n3.view_lightcurve(use_binner=True)
../_images/notebooks_grb080916C_33_0.png

The time series objects can read time bins from each other, so we will map these time bins onto the other detectors' time series and create a list of plugins, one per detector for each time bin created above.

[21]:
time_resolved_plugins = {}

for k, v in time_series.items():
    v.read_bins(n3)
    time_resolved_plugins[k] = v.to_spectrumlike(from_bins=True)

Setting up the model

For the time-resolved analysis, we will fit the classic Band function to the data. We will set some principled priors.

[22]:
band = Band()
band.alpha.prior = Truncated_gaussian(lower_bound=-1.5, upper_bound=1, mu=-1, sigma=0.5)
band.beta.prior = Truncated_gaussian(lower_bound=-5, upper_bound=-1.6, mu=-2, sigma=0.5)
band.xp.prior = Log_normal(mu=2, sigma=1)
band.xp.bounds = (0, None)
band.K.prior = Log_uniform_prior(lower_bound=1e-10, upper_bound=1e3)
ps = PointSource("grb", 0, 0, spectral_shape=band)
band_model = Model(ps)

Perform the fits

One way to perform Bayesian spectral fits to all the intervals is to loop through each one. There are many ways to do this, so find an analysis pattern that works for you.

[23]:
models = []
results = []
analysis = []
for interval in range(12):

    # clone the model above so that we have a separate model
    # for each fit

    this_model = clone_model(band_model)

    # for each detector set up the plugin
    # for this time interval

    this_data_list = []
    for k, v in time_resolved_plugins.items():

        pi = v[interval]

        if k.startswith("b"):
            pi.set_active_measurements("250-30000")
        else:
            pi.set_active_measurements("9-900")

        pi.rebin_on_background(1.0)

        this_data_list.append(pi)

    # create a data list

    dlist = DataList(*this_data_list)

    # set up the sampler and fit

    bayes = BayesianAnalysis(this_model, dlist)

    # get some speed with share spectrum
    bayes.set_sampler("multinest", share_spectrum=True)
    bayes.sampler.setup(n_live_points=500)
    bayes.sample()

    # at this stage we could also save the
    # analysis results to disk (see the sketch
    # after the sampling output below), but we
    # will simply hold onto them in memory

    analysis.append(bayes)
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.20 -0.25 +0.24) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-6.9 -0.5 +0.4) x 10^-1
grb.spectrum.main.Band.xp (3.3 +/- 0.4) x 10^2 keV
grb.spectrum.main.Band.beta -1.794 -0.033 +0.04

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval0 -281.023947
n3_interval0 -244.526350
n4_interval0 -261.580637
total -787.130933

Values of statistical measures:

statistical measures
AIC 1582.375181
BIC 1597.783999
DIC 1561.117744
PDIC 2.108533
log(Z) -344.694315
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (4.09 +/- 0.13) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.64 -0.26 +0.25) x 10^-1
grb.spectrum.main.Band.xp (6.4 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.18 +/- 0.09

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval1 -665.657390
n3_interval1 -633.881699
n4_interval1 -637.796932
total -1937.336022

Values of statistical measures:

statistical measures
AIC 3882.785359
BIC 3898.194177
DIC 3857.076474
PDIC 3.564376
log(Z) -843.923326
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.60 -0.20 +0.21) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.04 +/- 0.06
grb.spectrum.main.Band.xp (5.9 -1.5 +1.4) x 10^2 keV
grb.spectrum.main.Band.beta -1.93 +/- 0.13

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval2 -317.159339
n3_interval2 -281.892419
n4_interval2 -305.281684
total -904.333442

Values of statistical measures:

statistical measures
AIC 1816.780199
BIC 1832.189016
DIC 1789.232171
PDIC 2.430302
log(Z) -393.621127
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.96 -0.35 +0.34) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.1 +/- 0.9) x 10^-1
grb.spectrum.main.Band.xp (3.5 +/- 0.6) x 10^2 keV
grb.spectrum.main.Band.beta -2.45 +/- 0.30

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval3 -292.493502
n3_interval3 -236.729102
n4_interval3 -256.543524
total -785.766128

Values of statistical measures:

statistical measures
AIC 1579.645570
BIC 1595.054388
DIC 1557.675920
PDIC 2.772570
log(Z) -342.308356
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.00 +/- 0.10) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.004 -0.035 +0.034
grb.spectrum.main.Band.xp (4.4 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.20 -0.11 +0.10

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval4 -773.012560
n3_interval4 -751.080549
n4_interval4 -740.502045
total -2264.595154

Values of statistical measures:

statistical measures
AIC 4537.303622
BIC 4552.712440
DIC 4517.202144
PDIC 2.635350
log(Z) -987.169531
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.85 +/- 0.17) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.0 +/- 0.5) x 10^-1
grb.spectrum.main.Band.xp (4.1 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.13 -0.13 +0.12

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval5 -530.328738
n3_interval5 -516.989309
n4_interval5 -521.011955
total -1568.330002

Values of statistical measures:

statistical measures
AIC 3144.773318
BIC 3160.182136
DIC 3122.672746
PDIC 3.071961
log(Z) -683.391120
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.96 +/- 0.13) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.01 +/- 0.05
grb.spectrum.main.Band.xp (4.6 +/- 0.7) x 10^2 keV
grb.spectrum.main.Band.beta -2.46 +/- 0.29

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval6 -603.149533
n3_interval6 -577.463303
n4_interval6 -570.485271
total -1751.098107

Values of statistical measures:

statistical measures
AIC 3510.309529
BIC 3525.718347
DIC 3487.158023
PDIC 3.163839
log(Z) -762.273262
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.68 +/- 0.11) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.04 +/- 0.05
grb.spectrum.main.Band.xp (4.3 +/- 0.6) x 10^2 keV
grb.spectrum.main.Band.beta -2.12 -0.06 +0.04

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval7 -655.982702
n3_interval7 -634.358952
n4_interval7 -644.086932
total -1934.428586

Values of statistical measures:

statistical measures
AIC 3876.970486
BIC 3892.379303
DIC 3855.389448
PDIC 2.902452
log(Z) -842.584867
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.53 +/- 0.12) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.5 +/- 0.6) x 10^-1
grb.spectrum.main.Band.xp (3.9 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.43 -0.27 +0.26

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval8 -696.311837
n3_interval8 -692.245698
n4_interval8 -660.397033
total -2048.954568

Values of statistical measures:

statistical measures
AIC 4106.022450
BIC 4121.431267
DIC 4085.768371
PDIC 3.321341
log(Z) -891.868469
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.8 -1.0 +0.6) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-7.6 -2.4 +2.2) x 10^-1
grb.spectrum.main.Band.xp (1.1 +/- 0.4) x 10^2 keV
grb.spectrum.main.Band.beta -1.92 -0.23 +0.21

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval9 -645.568029
n3_interval9 -614.374378
n4_interval9 -613.466901
total -1873.409308

Values of statistical measures:

statistical measures
AIC 3754.931931
BIC 3770.340748
DIC 3677.589609
PDIC -63.435389
log(Z) -815.921469
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.4 +/- 0.6) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-6.8 -1.6 +1.5) x 10^-1
grb.spectrum.main.Band.xp (2.0 -0.4 +0.5) x 10^2 keV
grb.spectrum.main.Band.beta -1.85 +/- 0.08

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval10 -456.306652
n3_interval10 -432.716733
n4_interval10 -428.787323
total -1317.810708

Values of statistical measures:

statistical measures
AIC 2643.734731
BIC 2659.143548
DIC 2624.188513
PDIC 0.060321
log(Z) -574.651629
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.4 -1.5 +1.3) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-4.4 -2.8 +2.6) x 10^-1
grb.spectrum.main.Band.xp (1.32 -0.30 +0.29) x 10^2 keV
grb.spectrum.main.Band.beta -2.28 -0.33 +0.31

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval11 -288.583488
n3_interval11 -268.670128
n4_interval11 -251.830625
total -809.084241

Values of statistical measures:

statistical measures
AIC 1626.281796
BIC 1641.690613
DIC 1607.236621
PDIC -1.718849
log(Z) -352.563091
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -793.68799216047671      +/-  0.19407790771924158
 Total Likelihood Evaluations:        18168
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1943.2052711506869      +/-  0.21281347840650378
 Total Likelihood Evaluations:        22254
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -906.34614007416974      +/-  0.19326396810368568
 Total Likelihood Evaluations:        20266
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -788.19411770286024      +/-  0.17682711989755709
 Total Likelihood Evaluations:        16755
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -2273.0418469320430      +/-  0.20393590958778587
 Total Likelihood Evaluations:        19735
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1573.5662045383090      +/-  0.19378357152893602
 Total Likelihood Evaluations:        20736
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1755.1990501958767      +/-  0.19243593621178473
 Total Likelihood Evaluations:        19880
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1940.1233539336022      +/-  0.19393489728501576
 Total Likelihood Evaluations:        20446
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -2053.6030426908856      +/-  0.18548331175654012
 Total Likelihood Evaluations:        21885
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1878.7286117372562      +/-  0.14465920461438692
 Total Likelihood Evaluations:        13326
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1323.1842741922358      +/-  0.17116938223954647
 Total Likelihood Evaluations:        15167
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -811.80651880798575      +/-  0.14558770425964063
 Total Likelihood Evaluations:        12851
 Sampling finished. Exiting MultiNest
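
As noted in the comment inside the fitting loop, the per-interval results could also be written to disk rather than kept in memory. A minimal sketch using 3ML's analysis-results I/O, with placeholder file names:

# sketch: persist each interval's fit results (placeholder file names)
for i, a in enumerate(analysis):
    a.results.write_to(f"interval_{i}.fits", overwrite=True)

# ...the results can then be reloaded later without re-running the sampler
results_0 = load_analysis_results("interval_0.fits")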

Examine the fits

Now we can look at the fits in count space to make sure they are ok.

[24]:
for a in analysis:
    a.restore_median_fit()
    _ = display_spectrum_model_counts(a, min_rate=[20, 20, -99], step=False)
../_images/notebooks_grb080916C_41_0.png
../_images/notebooks_grb080916C_41_1.png
../_images/notebooks_grb080916C_41_2.png
../_images/notebooks_grb080916C_41_3.png
../_images/notebooks_grb080916C_41_4.png
../_images/notebooks_grb080916C_41_5.png
../_images/notebooks_grb080916C_41_6.png
../_images/notebooks_grb080916C_41_7.png
../_images/notebooks_grb080916C_41_8.png
../_images/notebooks_grb080916C_41_9.png
../_images/notebooks_grb080916C_41_10.png
../_images/notebooks_grb080916C_41_11.png

Finally, we can plot the models together to see how the spectra evolve with time.

[25]:
fig = plot_spectra(
    *[a.results for a in analysis[::1]],
    flux_unit="erg2/(cm2 s keV)",
    fit_cmap="viridis",
    contour_cmap="viridis",
    contour_style_kwargs=dict(alpha=0.1),
)
../_images/notebooks_grb080916C_43_13.png

This example can serve as a template for performing analysis on GBM data. However, as 3ML provides an abstract interface and modular building blocks, similar analysis pipelines can be built for any time series data.
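
For example, the per-detector reduction used above can be collected into a small helper and reused for other triggers. The sketch below is assembled only from calls demonstrated in this notebook and, for brevity, fits the background directly on the TTE data rather than on CSPEC as was done above; the trigger name, detector list, and interval strings are placeholders:

# sketch of a reusable reduction step, assembled from the calls used above
def build_gbm_plugins(trigger, detectors, background_interval, source_interval):

    dload = download_GBM_trigger_data(trigger, detectors=detectors)

    plugins = []
    for det in detectors:

        ts = TimeSeriesBuilder.from_gbm_tte(
            det, tte_file=dload[det]["tte"], rsp_file=dload[det]["rsp"]
        )

        # background and source selections
        ts.set_background_interval(*background_interval.split(","))
        ts.set_active_time_interval(source_interval)

        plugin = ts.to_spectrumlike()

        # standard GBM channel selections for BGO vs. NaI detectors
        if det.startswith("b"):
            plugin.set_active_measurements("250-30000")
        else:
            plugin.set_active_measurements("9-900")

        plugin.rebin_on_background(1.0)
        plugins.append(plugin)

    return DataList(*plugins)

Calling, for instance, build_gbm_plugins("bn080916009", gbm_detectors, background_interval, source_interval) would then produce a data list equivalent to the fluence plugins built above, up to the CSPEC background fit.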