Analyzing GRB 080916C

(Image credit: NASA/Swift/Cruz deWilde)

To demonstrate the capabilities and features of 3ML, we will go through a time-integrated and time-resolved analysis. This example serves both as a standard way to analyze Fermi-GBM data with 3ML and as a template for designing your own instrument’s analysis pipeline with 3ML if you have similar data.

3ML provides utilities to reduce time series data to plugins in a correct and statistically justified way (e.g., background fitting of Poisson data is done with a Poisson likelihood). The approach is generic and can be extended. For more details, see the time series documentation.

[2]:
%%capture
import matplotlib.pyplot as plt
import numpy as np

np.seterr(all="ignore")


from threeML import *
from threeML.io.package_data import get_path_of_data_file

Examining the catalog

As with Swift and Fermi-LAT, 3ML provides a simple interface to the on-line Fermi-GBM catalog. Let’s get the information for GRB 080916C.

[4]:
gbm_catalog = FermiGBMBurstCatalog()
gbm_catalog.query_sources("GRB080916009")
[4]:
Table length=1
    name        ra      dec   trigger_time    t90
   object    float64 float64    float64     float64
------------ ------- ------- ------------- -------
GRB080916009 119.800 -56.600 54725.0088613  62.977

To aid in quickly replicating the catalog analysis, and thanks to the tireless efforts of the Fermi-GBM team, we have added the ability to extract the analysis parameters from the catalog as well as to build an astromodels model with the best-fit parameters baked in. Using this information, one can quickly run through the catalog and replicate the entire analysis with a script. Let’s give it a try.

[5]:
grb_info = gbm_catalog.get_detector_information()["GRB080916009"]

gbm_detectors = grb_info["detectors"]
source_interval = grb_info["source"]["fluence"]
background_interval = grb_info["background"]["full"]
best_fit_model = grb_info["best fit model"]["fluence"]
model = gbm_catalog.get_model(best_fit_model, "fluence")["GRB080916009"]
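
For example, one could query several bursts at once and script the whole procedure. The sketch below is illustrative only: the second trigger name is arbitrary, and it assumes that query_sources accepts several source names in one call.

gbm_catalog.query_sources("GRB080916009", "GRB090510016")

for name, info in gbm_catalog.get_detector_information().items():
    # rebuild the catalog model for each burst
    catalog_model = gbm_catalog.get_model(
        info["best fit model"]["fluence"], "fluence"
    )[name]
    print(name, info["detectors"], catalog_model.get_number_of_point_sources())
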
[6]:
model
[6]:
Model summary:

N
Point sources 1
Extended sources 0
Particle sources 0


Free parameters (5):

value min_value max_value unit
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.K 0.012255 0.0 None keV-1 s-1 cm-2
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.alpha -1.130424 -1.5 2.0
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.break_energy 309.2031 10.0 None keV
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.break_scale 0.3 0.0 10.0
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.beta -2.096931 -5.0 -1.6


Fixed parameters (3):
(abridged. Use complete=True to see all fixed parameters)


Linked parameters (0):

(none)

Independent variables:

(none)
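
The summary above abridges the fixed parameters. As the note says, the full list can be shown by passing complete=True to the model’s display method:

model.display(complete=True)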

Downloading the data

We provide a simple interface to download the Fermi-GBM data. Using the information from the catalog that we have extracted, we can download just the data from the detectors that were used for the catalog analysis. This will download the CSPEC, TTE and instrument response files from the on-line database.

[7]:
dload = download_GBM_trigger_data("bn080916009", detectors=gbm_detectors)
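
The return value is a dictionary keyed by detector; each entry holds the paths of the downloaded files under the "cspec", "tte", and "rsp" keys, which we use below. A quick way to see what was retrieved:

for det, files in dload.items():
    print(det, files["tte"])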

Let’s first examine the catalog fluence fit. Using the TimeSeriesBuilder, we can fit the background, set the source interval, and create a 3ML plugin for the analysis. We will loop through the detectors, set their appropriate channel selections, and ensure there are enough counts in each bin to make the PGStat profile likelihood valid.

  • First, we use the CSPEC data to fit the background over the catalog background selections. We use CSPEC because its longer duration is better suited for background fitting.

  • The background is saved to an HDF5 file that stores the polynomial coefficients and selections, which we can later apply to the TTE data.

  • The light curve is plotted.

  • The source selection from the catalog is set and a DispersionSpectrumLike plugin is created.

  • The standard GBM channel selections for spectral analysis are applied to the plugin.

[8]:
fluence_plugins = []
time_series = {}
for det in gbm_detectors:

    # fit the background on the long CSPEC light curve using the
    # catalog background selections, then save it for later use
    ts_cspec = TimeSeriesBuilder.from_gbm_cspec_or_ctime(
        det, cspec_or_ctime_file=dload[det]["cspec"], rsp_file=dload[det]["rsp"]
    )

    ts_cspec.set_background_interval(*background_interval.split(","))
    ts_cspec.save_background(f"{det}_bkg.h5", overwrite=True)

    # build the TTE time series, restoring the fitted background
    ts_tte = TimeSeriesBuilder.from_gbm_tte(
        det,
        tte_file=dload[det]["tte"],
        rsp_file=dload[det]["rsp"],
        restore_background=f"{det}_bkg.h5",
    )

    time_series[det] = ts_tte

    # select the catalog source interval and plot the light curve
    ts_tte.set_active_time_interval(source_interval)

    ts_tte.view_lightcurve(-40, 100)

    fluence_plugin = ts_tte.to_spectrumlike()

    # standard GBM channel selections: BGO (b*) vs. NaI (n*) detectors
    if det.startswith("b"):

        fluence_plugin.set_active_measurements("250-30000")

    else:

        fluence_plugin.set_active_measurements("9-900")

    # rebin so that each bin has enough counts for a valid likelihood
    fluence_plugin.rebin_on_background(1.0)

    fluence_plugins.append(fluence_plugin)
../_images/notebooks_grb080916C_12_9.png
../_images/notebooks_grb080916C_12_10.png
../_images/notebooks_grb080916C_12_11.png

Setting up the fit

Let’s see if we can reproduce the results from the catalog.

Set priors for the model

We will fit the spectrum using Bayesian analysis, so we must set priors on the model parameters.

[9]:
model.GRB080916009.spectrum.main.shape.alpha.prior = Truncated_gaussian(
    lower_bound=-1.5, upper_bound=1, mu=-1, sigma=0.5
)
model.GRB080916009.spectrum.main.shape.beta.prior = Truncated_gaussian(
    lower_bound=-5, upper_bound=-1.6, mu=-2.25, sigma=0.5
)
model.GRB080916009.spectrum.main.shape.break_energy.prior = Log_normal(mu=2, sigma=1)
model.GRB080916009.spectrum.main.shape.break_energy.bounds = (None, None)
model.GRB080916009.spectrum.main.shape.K.prior = Log_uniform_prior(
    lower_bound=1e-3, upper_bound=1e1
)
model.GRB080916009.spectrum.main.shape.break_scale.prior = Log_uniform_prior(
    lower_bound=1e-4, upper_bound=10
)
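
As a quick sanity check before sampling, we can confirm that every free parameter now carries a prior (a small sketch, not part of the original catalog workflow):

for name, parameter in model.free_parameters.items():
    print(name, "->", type(parameter.prior).__name__)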

Clone the model and set up the Bayesian analysis class

Next, we clone the model we built from the catalog so that we can look at the results later and fit the cloned model. We pass this model and the DataList of the plugins to a BayesianAnalysis class and set the sampler to MultiNest.

[10]:
new_model = clone_model(model)

bayes = BayesianAnalysis(new_model, DataList(*fluence_plugins))

# share spectrum gives a linear speed up when
# spectrumlike plugins have the same RSP input energies
bayes.set_sampler("multinest", share_spectrum=True)

Examine the catalog fitted model

We can quickly examine how well the catalog fit matches the data. There appears to be a discrepancy between the data and the model! Let’s refit to see if we can fix it.

[11]:
fig = display_spectrum_model_counts(bayes, min_rate=20, step=False)
../_images/notebooks_grb080916C_18_0.png

Run the sampler

We let MultiNest condition the model on the data.

[12]:
bayes.sampler.setup(n_live_points=400)
bayes.sample()
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
GRB080916009...K (1.469 -0.018 +0.019) x 10^-2 1 / (cm2 keV s)
GRB080916009...alpha -1.071 -0.018 +0.019
GRB080916009...break_energy (2.30 -0.31 +0.33) x 10^2 keV
GRB080916009...break_scale (2.3 -0.8 +0.9) x 10^-1
GRB080916009...beta -2.17 +/- 0.12

Values of -log(posterior) at the minimum:

-log(posterior)
b0 -1051.847164
n3 -1023.454306
n4 -1014.309805
total -3089.611275

Values of statistical measures:

statistical measures
AIC 6189.393004
BIC 6208.625215
DIC 6176.650563
PDIC 4.575138
log(Z) -1347.537224
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  400
 dimensionality =    5
 *****************************************************
 ln(ev)=  -3102.8191238915133      +/-  0.22134116091795444
 Total Likelihood Evaluations:        25706
 Sampling finished. Exiting MultiNest

Now our model seems to match the data much better!

[13]:
bayes.restore_median_fit()
fig = display_spectrum_model_counts(bayes, min_rate=20)
../_images/notebooks_grb080916C_22_0.png

But how different are we from the catalog model? Let’s plot our fit along with the catalog model. Luckily, 3ML can handle all the units for us.

[14]:
conversion = u.Unit("keV2/(cm2 s keV)").to("erg2/(cm2 s keV)")
energy_grid = np.logspace(1, 4, 100) * u.keV
vFv = (energy_grid ** 2 * model.get_point_source_fluxes(0, energy_grid)).to(
    "erg2/(cm2 s keV)"
)
[15]:
fig = plot_spectra(bayes.results, flux_unit="erg2/(cm2 s keV)")
ax = fig.get_axes()[0]
_ = ax.loglog(energy_grid, vFv, color="blue", label="catalog model")
../_images/notebooks_grb080916C_25_3.png
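
For a number-by-number comparison rather than a visual one, the posterior summaries can be placed next to the catalog values. This is only a sketch; it assumes the results data frame is indexed by the same parameter paths used in the (unmodified) catalog model:

fit_frame = bayes.results.get_data_frame()

for path, row in fit_frame.iterrows():
    print(path, ":", row["value"], "(catalog value:", model[path].value, ")")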

Time-Resolved Analysis

Now that we have examined the fluence fit, we can move on to a time-resolved analysis.

Selecting a temporal binning

We first get the brightest NaI detector and create time bins via the Bayesian blocks algorithm. We can use the fitted background to make sure that our intervals are chosen in an unbiased way.

[16]:
n3 = time_series["n3"]
[17]:
n3.create_time_bins(0, 60, method="bayesblocks", use_background=True, p0=0.2)

Sometimes, glitches in the GBM data produce spikes that the Bayesian blocks algorithm interprets as real, rapid changes in the count rate. We will have to remove those intervals manually.

Note: In the future, 3ML will provide an automated method to remove these unwanted spikes.

[18]:
fig = n3.view_lightcurve(use_binner=True)
../_images/notebooks_grb080916C_30_0.png
[19]:
bad_bins = []

# flag any bin narrower than 50 ms: these are the spikes caused
# by glitches in the data rather than real source variability
for i, w in enumerate(n3.bins.widths):

    if w < 5e-2:
        bad_bins.append(i)


# rebuild the bin edges, skipping the flagged bins
edges = [n3.bins.starts[0]]

for i, b in enumerate(n3.bins):

    if i not in bad_bins:
        edges.append(b.stop)

starts = edges[:-1]
stops = edges[1:]


# replace the Bayesian-blocks binning with the cleaned custom bins
n3.create_time_bins(starts, stops, method="custom")

Now our light curve looks much more acceptable.

[20]:
fig = n3.view_lightcurve(use_binner=True)
../_images/notebooks_grb080916C_33_0.png

The time series objects can read time bins from each other, so we will map these time bins onto the other detectors’ time series and create a list of plugins for each detector, one per time bin created above.

[21]:
time_resolved_plugins = {}

for k, v in time_series.items():
    v.read_bins(n3)
    time_resolved_plugins[k] = v.to_spectrumlike(from_bins=True)
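
Each detector should now hold one plugin per time bin; a quick check that the mapping worked (sketch):

for det, plugins in time_resolved_plugins.items():
    print(det, "->", len(plugins), "plugins")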

Setting up the model

For the time-resolved analysis, we will fit the classic Band function to the data. We will set some principled priors.

[22]:
band = Band()
band.alpha.prior = Truncated_gaussian(lower_bound=-1.5, upper_bound=1, mu=-1, sigma=0.5)
band.beta.prior = Truncated_gaussian(lower_bound=-5, upper_bound=-1.6, mu=-2, sigma=0.5)
band.xp.prior = Log_normal(mu=2, sigma=1)
band.xp.bounds = (0, None)
band.K.prior = Log_uniform_prior(lower_bound=1e-10, upper_bound=1e3)
ps = PointSource("grb", 0, 0, spectral_shape=band)
band_model = Model(ps)

Perform the fits

One way to perform Bayesian spectral fits to all the intervals is to loop through each one. There are many ways to do this, so find an analysis pattern that works for you.

[23]:
models = []
results = []
analysis = []
for interval in range(12):

    # clone the model above so that we have a separate model
    # for each fit

    this_model = clone_model(band_model)

    # for each detector set up the plugin
    # for this time interval

    this_data_list = []
    for k, v in time_resolved_plugins.items():

        pi = v[interval]

        if k.startswith("b"):
            pi.set_active_measurements("250-30000")
        else:
            pi.set_active_measurements("9-900")

        pi.rebin_on_background(1.0)

        this_data_list.append(pi)

    # create a data list

    dlist = DataList(*this_data_list)

    # set up the sampler and fit

    bayes = BayesianAnalysis(this_model, dlist)

    # get some speed with share spectrum
    bayes.set_sampler("multinest", share_spectrum=True)
    bayes.sampler.setup(n_live_points=500)
    bayes.sample()

    # at this stage we could also save the
    # analysis results to disk (see the sketch
    # after the sampling output below), but we
    # will simply hold onto them in memory

    analysis.append(bayes)
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.5 +/- 0.6) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-5.9 -1.2 +1.4) x 10^-1
grb.spectrum.main.Band.xp (3.3 -0.8 +0.7) x 10^2 keV
grb.spectrum.main.Band.beta -2.5 -0.4 +0.7

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval0 -280.661910
n3_interval0 -245.574791
n4_interval0 -261.970675
total -788.207376

Values of statistical measures:

statistical measures
AIC 1584.528067
BIC 1599.936885
DIC 1562.461909
PDIC 2.468858
log(Z) -344.136575
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (4.15 +/- 0.13) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.50 +/- 0.27) x 10^-1
grb.spectrum.main.Band.xp (6.1 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.21 +/- 0.09

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval1 -665.323834
n3_interval1 -634.521755
n4_interval1 -638.073698
total -1937.919287

Values of statistical measures:

statistical measures
AIC 3883.951888
BIC 3899.360705
DIC 3858.369351
PDIC 3.466320
log(Z) -844.114238
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.54 -0.24 +0.21) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.05 -0.08 +0.07
grb.spectrum.main.Band.xp (6.6 +/- 1.7) x 10^2 keV
grb.spectrum.main.Band.beta -2.48 -0.26 +0.25

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval2 -317.245007
n3_interval2 -283.689296
n4_interval2 -306.251807
total -907.186111

Values of statistical measures:

statistical measures
AIC 1822.485537
BIC 1837.894354
DIC 1792.718702
PDIC 2.934548
log(Z) -394.302140
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.0 +/- 0.4) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.1 +/- 0.9) x 10^-1
grb.spectrum.main.Band.xp (3.3 +/- 0.6) x 10^2 keV
grb.spectrum.main.Band.beta -2.14 +/- 0.18

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval3 -291.977080
n3_interval3 -237.201530
n4_interval3 -257.228944
total -786.407555

Values of statistical measures:

statistical measures
AIC 1580.928424
BIC 1596.337241
DIC 1559.788488
PDIC 2.677593
log(Z) -342.915933
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.06 +/- 0.11) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.7 +/- 0.4) x 10^-1
grb.spectrum.main.Band.xp (4.0 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.08 -0.12 +0.13

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval4 -773.588325
n3_interval4 -751.411448
n4_interval4 -741.051640
total -2266.051413

Values of statistical measures:

statistical measures
AIC 4540.216140
BIC 4555.624958
DIC 4520.755226
PDIC 3.697949
log(Z) -986.887264
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.81 -0.19 +0.20) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.0 +/- 0.5) x 10^-1
grb.spectrum.main.Band.xp (4.2 -0.6 +0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.29 -0.20 +0.21

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval5 -531.955628
n3_interval5 -517.558792
n4_interval5 -522.650990
total -1572.165410

Values of statistical measures:

statistical measures
AIC 3152.444135
BIC 3167.852952
DIC 3130.780281
PDIC 3.298499
log(Z) -684.773776
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.98 +/- 0.13) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-10.0 +/- 0.5) x 10^-1
grb.spectrum.main.Band.xp (4.3 +/- 0.6) x 10^2 keV
grb.spectrum.main.Band.beta -2.56 +/- 0.29

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval6 -607.046699
n3_interval6 -577.957348
n4_interval6 -570.974732
total -1755.978779

Values of statistical measures:

statistical measures
AIC 3520.070872
BIC 3535.479690
DIC 3496.931504
PDIC 3.346854
log(Z) -764.183846
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.72 +/- 0.13) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.02 +/- 0.05
grb.spectrum.main.Band.xp (4.0 +/- 0.6) x 10^2 keV
grb.spectrum.main.Band.beta -2.23 -0.11 +0.12

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval7 -659.635644
n3_interval7 -634.950290
n4_interval7 -644.360216
total -1938.946149

Values of statistical measures:

statistical measures
AIC 3886.005613
BIC 3901.414431
DIC 3864.441900
PDIC 3.107498
log(Z) -844.612980
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.57 +/- 0.13) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.2 +/- 0.6) x 10^-1
grb.spectrum.main.Band.xp (3.6 -0.5 +0.4) x 10^2 keV
grb.spectrum.main.Band.beta -2.52 -0.28 +0.27

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval8 -696.996486
n3_interval8 -693.560595
n4_interval8 -660.969725
total -2051.526806

Values of statistical measures:

statistical measures
AIC 4111.166926
BIC 4126.575744
DIC 4090.803784
PDIC 3.392360
log(Z) -892.929754
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.16 -0.5 +0.21) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.0 -1.8 +1.6) x 10^-1
grb.spectrum.main.Band.xp (1.4 +/- 0.4) x 10^2 keV
grb.spectrum.main.Band.beta -2.20 -0.34 +0.4

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval9 -646.943463
n3_interval9 -615.011418
n4_interval9 -613.726997
total -1875.681878

Values of statistical measures:

statistical measures
AIC 3759.477071
BIC 3774.885888
DIC 3731.968540
PDIC -13.468883
log(Z) -817.110257
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.1 +/- 0.4) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-7.4 -1.4 +1.3) x 10^-1
grb.spectrum.main.Band.xp (2.3 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.25 -0.34 +0.33

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval10 -457.152346
n3_interval10 -433.848555
n4_interval10 -429.381908
total -1320.382809

Values of statistical measures:

statistical measures
AIC 2648.878932
BIC 2664.287749
DIC 2630.504162
PDIC 1.672146
log(Z) -575.099140
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.5 -1.6 +1.9) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-4.3 -2.6 +2.9) x 10^-1
grb.spectrum.main.Band.xp (1.27 -0.29 +0.27) x 10^2 keV
grb.spectrum.main.Band.beta -2.25 -0.33 +0.32

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval11 -289.615220
n3_interval11 -268.917839
n4_interval11 -252.640037
total -811.173096

Values of statistical measures:

statistical measures
AIC 1630.459507
BIC 1645.868324
DIC 1611.436494
PDIC -1.232983
log(Z) -353.415743
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -792.40374869823870      +/-  0.18726371901169406
 Total Likelihood Evaluations:        16449
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1943.6448614695250      +/-  0.21104976586841326
 Total Likelihood Evaluations:        22949
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -907.91423057024940      +/-  0.19625756103174202
 Total Likelihood Evaluations:        20543
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -789.59311494471262      +/-  0.17657976502663780
 Total Likelihood Evaluations:        18049
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -2272.3919025995592      +/-  0.19321528103848717
 Total Likelihood Evaluations:        22254
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1576.7498879118564      +/-  0.19044495421651592
 Total Likelihood Evaluations:        20799
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1759.5983326736591      +/-  0.19011937338574139
 Total Likelihood Evaluations:        21553
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1944.7932575778968      +/-  0.19442440091576232
 Total Likelihood Evaluations:        19326
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -2056.0467407053611      +/-  0.18513000002175817
 Total Likelihood Evaluations:        20914
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1881.4658971189560      +/-  0.15004339439211600
 Total Likelihood Evaluations:        12507
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1324.2147062007809      +/-  0.16547906519588984
 Total Likelihood Evaluations:        15636
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -813.76982085850466      +/-  0.14617952512622973
 Total Likelihood Evaluations:        11987
 Sampling finished. Exiting MultiNest
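
As noted in the comment inside the loop, each interval’s result could also be written to disk rather than held only in memory. A minimal sketch using 3ML’s analysis-results I/O (the file names are illustrative):

for i, a in enumerate(analysis):
    a.results.write_to(f"interval_{i}.fits", overwrite=True)

# later, or in another session:
# reloaded = [load_analysis_results(f"interval_{i}.fits") for i in range(12)]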

Examine the fits

Now we can look at the fits in count space to make sure they are reasonable.

[24]:
for a in analysis:
    a.restore_median_fit()

    # min_rate may be given per plugin (in DataList order) to control
    # how much each data set is rebinned for plotting purposes only
    _ = display_spectrum_model_counts(a, min_rate=[20, 20, -99], step=False)
../_images/notebooks_grb080916C_41_0.png
../_images/notebooks_grb080916C_41_1.png
../_images/notebooks_grb080916C_41_2.png
../_images/notebooks_grb080916C_41_3.png
../_images/notebooks_grb080916C_41_4.png
../_images/notebooks_grb080916C_41_5.png
../_images/notebooks_grb080916C_41_6.png
../_images/notebooks_grb080916C_41_7.png
../_images/notebooks_grb080916C_41_8.png
../_images/notebooks_grb080916C_41_9.png
../_images/notebooks_grb080916C_41_10.png
../_images/notebooks_grb080916C_41_11.png

Finally, we can plot the models together to see how the spectra evolve with time.

[25]:
fig = plot_spectra(
    *[a.results for a in analysis[::1]],
    flux_unit="erg2/(cm2 s keV)",
    fit_cmap="viridis",
    contour_cmap="viridis",
    contour_style_kwargs=dict(alpha=0.1),
)
../_images/notebooks_grb080916C_43_14.png

This example can serve as a template for performing analysis on GBM data. However, as 3ML provides an abstract interface and modular building blocks, similar analysis pipelines can be built for any time series data.
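
As a compact reference, the fluence reduction used above can be wrapped into a single function. This is only a sketch assembled from the calls in this notebook; it assumes the same per-detector file dictionary produced by download_GBM_trigger_data and the same interval strings:

def build_fluence_plugin(det, files, background_interval, source_interval):
    """Reduce one GBM detector to a spectral plugin (sketch of the steps above)."""

    # fit and save the background using the long CSPEC light curve
    ts_cspec = TimeSeriesBuilder.from_gbm_cspec_or_ctime(
        det, cspec_or_ctime_file=files["cspec"], rsp_file=files["rsp"]
    )
    ts_cspec.set_background_interval(*background_interval.split(","))
    ts_cspec.save_background(f"{det}_bkg.h5", overwrite=True)

    # build the TTE time series, restoring the fitted background
    ts_tte = TimeSeriesBuilder.from_gbm_tte(
        det,
        tte_file=files["tte"],
        rsp_file=files["rsp"],
        restore_background=f"{det}_bkg.h5",
    )
    ts_tte.set_active_time_interval(source_interval)

    # make the plugin and apply the standard GBM channel selections
    plugin = ts_tte.to_spectrumlike()
    plugin.set_active_measurements("250-30000" if det.startswith("b") else "9-900")
    plugin.rebin_on_background(1.0)

    return plugin


# usage, with the objects defined earlier in this notebook:
# plugins = [
#     build_fluence_plugin(det, dload[det], background_interval, source_interval)
#     for det in gbm_detectors
# ]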