Analyzing GRB 080916C

(Image credit: NASA/Swift/Cruz deWilde)

To demonstrate the capabilities and features of 3ML, we will go through a time-integrated and a time-resolved analysis. This example serves both as a standard way to analyze Fermi-GBM data with 3ML and as a template for designing an analysis pipeline for your own instrument if you have similar data.

3ML provides utilities to reduce time series data to plugins in a correct and statistically justified way (e.g., background fitting of Poisson data is done with a Poisson likelihood). The approach is generic and can be extended. For more details, see the time series documentation.
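As a purely illustrative sketch of that statistical point (this is not 3ML's internal code), fitting a polynomial background rate to Poisson-distributed counts means maximizing a Poisson likelihood rather than performing a least-squares (Gaussian) fit:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# fake background light curve: counts per 1-s bin from a slowly varying rate
t = np.arange(-50.0, -10.0, 1.0)
true_rate = 50.0 + 0.2 * t
counts = rng.poisson(true_rate)

def neg_poisson_loglike(coeffs):
    # polynomial rate model; the rate must stay positive
    rate = np.polyval(coeffs, t)
    if np.any(rate <= 0):
        return np.inf
    # Poisson log-likelihood, dropping the constant log(n!) term
    return -np.sum(counts * np.log(rate) - rate)

# maximize the Poisson likelihood for a first-order polynomial background
fit = minimize(neg_poisson_loglike, x0=[0.0, counts.mean()], method="Nelder-Mead")
print("fitted background coefficients:", fit.x)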

[2]:
%%capture
import matplotlib.pyplot as plt
import numpy as np

np.seterr(all="ignore")


from threeML import *
from threeML.io.package_data import get_path_of_data_file

Examining the catalog

As with Swift and Fermi-LAT, 3ML provides a simple interface to the on-line Fermi-GBM catalog. Let’s get the information for GRB 080916C.

[4]:
gbm_catalog = FermiGBMBurstCatalog()
gbm_catalog.query_sources("GRB080916009")
[4]:
Table length=1
    name       ra     dec   trigger_time   t90
   object   float64 float64   float64    float64
------------ ------- ------- ------------ -------
GRB080916009 119.800 -56.600 54725.008861 362.977

To aid in quickly replicating the catalog analysis, and thanks to the tireless efforts of the Fermi-GBM team, we have added the ability to extract the analysis parameters from the catalog as well as build an astromodels model with the best-fit parameters baked in. Using this information, one can quickly run through the catalog and replicate the entire analysis with a script. Let’s give it a try.

[5]:
grb_info = gbm_catalog.get_detector_information()["GRB080916009"]

gbm_detectors = grb_info["detectors"]
source_interval = grb_info["source"]["fluence"]
background_interval = grb_info["background"]["full"]
best_fit_model = grb_info["best fit model"]["fluence"]
model = gbm_catalog.get_model(best_fit_model, "fluence")["GRB080916009"]
[6]:
model
[6]:
Model summary:

N
Point sources 1
Extended sources 0
Particle sources 0


Free parameters (5):

value min_value max_value unit
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.K 0.012255 0.0 None keV-1 s-1 cm-2
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.alpha -1.130424 -1.5 2.0
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.break_energy 309.2031 10.0 None keV
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.break_scale 0.3 0.0 10.0
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.beta -2.096931 -5.0 -1.6


Fixed parameters (3):
(abridged. Use complete=True to see all fixed parameters)


Linked parameters (0):

(none)

Independent variables:

(none)
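
If you also want to inspect the fixed parameters, you can redisplay the model with the complete flag suggested in the abridged output above:

model.display(complete=True)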

Downloading the data

We provide a simple interface to download the Fermi-GBM data. Using the information from the catalog that we have extracted, we can download just the data from the detectors that were used for the catalog analysis. This will download the CSPEC, TTE and instrument response files from the on-line database.

[7]:
dload = download_GBM_trigger_data("bn080916009", detectors=gbm_detectors)
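
dload is a dictionary keyed by detector name; each entry holds the local paths of the downloaded files under the keys used below ("cspec", "tte", and "rsp"). A quick way to check what was retrieved:

for det in gbm_detectors:
    print(det, dload[det]["cspec"], dload[det]["tte"], dload[det]["rsp"])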

Let’s first examine the catalog fluence fit. Using the TimeSeriesBuilder, we can fit the background, set the source interval, and create a 3ML plugin for the analysis. We will loop through the detectors, set their appropriate channel selections, and ensure there are enough counts in each bin to make the PGStat profile likelihood valid.

  • First we use the CSPEC data to fit the background with the catalog’s background selections; CSPEC is used because its longer duration is better suited for background fitting.

  • The background is saved to an HDF5 file that stores the polynomial coefficients and selections, which we later restore when building the TTE time series.

  • The light curve is plotted.

  • The source selection from the catalog is set and a DispersionSpectrumLike plugin is created.

  • The standard GBM channel selections for spectral analysis are applied to the plugin.

[8]:
fluence_plugins = []
time_series = {}
for det in gbm_detectors:

    ts_cspec = TimeSeriesBuilder.from_gbm_cspec_or_ctime(
        det, cspec_or_ctime_file=dload[det]["cspec"], rsp_file=dload[det]["rsp"]
    )

    ts_cspec.set_background_interval(*background_interval.split(","))
    ts_cspec.save_background(f"{det}_bkg.h5", overwrite=True)

    ts_tte = TimeSeriesBuilder.from_gbm_tte(
        det,
        tte_file=dload[det]["tte"],
        rsp_file=dload[det]["rsp"],
        restore_background=f"{det}_bkg.h5",
    )

    time_series[det] = ts_tte

    ts_tte.set_active_time_interval(source_interval)

    ts_tte.view_lightcurve(-40, 100)

    fluence_plugin = ts_tte.to_spectrumlike()

    if det.startswith("b"):

        fluence_plugin.set_active_measurements("250-30000")

    else:

        fluence_plugin.set_active_measurements("9-900")

    fluence_plugin.rebin_on_background(1.0)

    fluence_plugins.append(fluence_plugin)
../_images/notebooks_grb080916C_12_9.png
../_images/notebooks_grb080916C_12_10.png
../_images/notebooks_grb080916C_12_11.png

Setting up the fit

Let’s see if we can reproduce the results from the catalog.

Set priors for the model

We will fit the spectrum using Bayesian analysis, so we must set priors on the model parameters.

[9]:
model.GRB080916009.spectrum.main.shape.alpha.prior = Truncated_gaussian(
    lower_bound=-1.5, upper_bound=1, mu=-1, sigma=0.5
)
model.GRB080916009.spectrum.main.shape.beta.prior = Truncated_gaussian(
    lower_bound=-5, upper_bound=-1.6, mu=-2.25, sigma=0.5
)
model.GRB080916009.spectrum.main.shape.break_energy.prior = Log_normal(mu=2, sigma=1)
model.GRB080916009.spectrum.main.shape.break_energy.bounds = (None, None)
model.GRB080916009.spectrum.main.shape.K.prior = Log_uniform_prior(
    lower_bound=1e-3, upper_bound=1e1
)
model.GRB080916009.spectrum.main.shape.break_scale.prior = Log_uniform_prior(
    lower_bound=1e-4, upper_bound=10
)

Clone the model and set up the Bayesian analysis class

Next, we clone the model we built from the catalog so that the original is preserved for comparison later, and we fit the cloned model. We pass this model and a DataList of the plugins to a BayesianAnalysis class and set the sampler to MultiNest.

[10]:
new_model = clone_model(model)

bayes = BayesianAnalysis(new_model, DataList(*fluence_plugins))

# share spectrum gives a linear speed up when
# spectrumlike plugins have the same RSP input energies
bayes.set_sampler("multinest", share_spectrum=True)

Examine the catalog-fitted model

We can quickly examine how well the catalog fit matches the data. There appears to be a discrepancy between the data and the model! Let’s refit to see if we can fix it.

[11]:
fig = display_spectrum_model_counts(bayes, min_rate=20, step=False)
../_images/notebooks_grb080916C_18_0.png

Run the sampler

We let MultiNest condition the model on the data.

[12]:
bayes.sampler.setup(n_live_points=400)
bayes.sample()
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
GRB080916009...K (1.498 -0.028 +0.022) x 10^-2 1 / (cm2 keV s)
GRB080916009...alpha -1.044 -0.020 +0.017
GRB080916009...break_energy (1.91 -0.09 +0.08) x 10^2 keV
GRB080916009...break_scale (2.4 +/- 0.4) x 10^-1
GRB080916009...beta -2.03 +/- 0.04

Values of -log(posterior) at the minimum:

-log(posterior)
b0 -1051.987075
n3 -1025.313589
n4 -1012.984096
total -3090.284761

Values of statistical measures:

statistical measures
AIC 6190.739976
BIC 6209.972186
DIC 6181.004060
PDIC 4.014802
log(Z) -1350.509912
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  400
 dimensionality =    5
 *****************************************************
 ln(ev)=  -3109.6639911822522      +/-  0.24408224588693866
 Total Likelihood Evaluations:        22982
 Sampling finished. Exiting MultiNest

Now our model seems to match the data much better!

[13]:
bayes.restore_median_fit()
fig = display_spectrum_model_counts(bayes, min_rate=20)
../_images/notebooks_grb080916C_22_0.png
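
We could also inspect the marginal posterior distributions with a corner plot (a sketch using the standard threeML results interface):

fig_corner = bayes.results.corner_plot()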

But how different are we from the catalog model? Let’s plot our fit along with the catalog model. Luckily, 3ML can handle all the units for us.

[14]:
conversion = u.Unit("keV2/(cm2 s keV)").to("erg2/(cm2 s keV)")
energy_grid = np.logspace(1, 4, 100) * u.keV
vFv = (energy_grid ** 2 * model.get_point_source_fluxes(0, energy_grid)).to(
    "erg2/(cm2 s keV)"
)
[15]:
fig = plot_spectra(bayes.results, flux_unit="erg2/(cm2 s keV)")
ax = fig.get_axes()[0]
_ = ax.loglog(energy_grid, vFv, color="blue", label="catalog model")
../_images/notebooks_grb080916C_25_3.png

Time Resolved Analysis

Now that we have examined the fluence fit, we can move on to performing a time-resolved analysis.

Selecting a temporal binning

We first get the brightest NaI detector and create time bins via the Bayesian blocks algorithm. We can use the fitted background to make sure that our intervals are chosen in an unbiased way.

[16]:
n3 = time_series["n3"]
[17]:
n3.create_time_bins(0, 60, method="bayesblocks", use_background=True, p0=0.2)

Sometimes, glitches in the GBM data cause spikes that the Bayesian blocks algorithm interprets as rapid changes in the count rate. We will have to remove those intervals manually.

Note: In the future, 3ML will provide an automated method to remove these unwanted spikes.

[18]:
fig = n3.view_lightcurve(use_binner=True)
../_images/notebooks_grb080916C_30_0.png
[19]:
bad_bins = []
for i, w in enumerate(n3.bins.widths):

    if w < 5e-2:
        bad_bins.append(i)


edges = [n3.bins.starts[0]]

for i, b in enumerate(n3.bins):

    if i not in bad_bins:
        edges.append(b.stop)

starts = edges[:-1]
stops = edges[1:]


n3.create_time_bins(starts, stops, method="custom")

Now our light curve looks much more acceptable.

[20]:
fig = n3.view_lightcurve(use_binner=True)
../_images/notebooks_grb080916C_33_0.png

The time series objects can read time bins from each other, so we will map these time bins onto the other detectors’ time series and create a list of plugins for each detector, one per time bin created above.

[21]:
time_resolved_plugins = {}

for k, v in time_series.items():
    v.read_bins(n3)
    time_resolved_plugins[k] = v.to_spectrumlike(from_bins=True)
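
Each entry of time_resolved_plugins is now a list of DispersionSpectrumLike plugins, one per time bin. A quick sanity check (the exact count depends on the binning above):

for det, plugins in time_resolved_plugins.items():
    print(det, len(plugins), "time-resolved plugins")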

Setting up the model

For the time-resolved analysis, we will fit the classic Band function to the data. We will set some principled priors.

[22]:
band = Band()
band.alpha.prior = Truncated_gaussian(lower_bound=-1.5, upper_bound=1, mu=-1, sigma=0.5)
band.beta.prior = Truncated_gaussian(lower_bound=-5, upper_bound=-1.6, mu=-2, sigma=0.5)
band.xp.prior = Log_normal(mu=2, sigma=1)
band.xp.bounds = (0, None)
band.K.prior = Log_uniform_prior(lower_bound=1e-10, upper_bound=1e3)
ps = PointSource("grb", 0, 0, spectral_shape=band)
band_model = Model(ps)

Perform the fits

One way to perform Bayesian spectral fits to all the intervals is to loop through each one. There are many ways to do this, so find an analysis pattern that works for you.

[23]:
models = []
results = []
analysis = []
for interval in range(12):

    # clone the model above so that we have a separate model
    # for each fit

    this_model = clone_model(band_model)

    # for each detector set up the plugin
    # for this time interval

    this_data_list = []
    for k, v in time_resolved_plugins.items():

        pi = v[interval]

        if k.startswith("b"):
            pi.set_active_measurements("250-30000")
        else:
            pi.set_active_measurements("9-900")

        pi.rebin_on_background(1.0)

        this_data_list.append(pi)

    # create a data list

    dlist = DataList(*this_data_list)

    # set up the sampler and fit

    bayes = BayesianAnalysis(this_model, dlist)

    # get some speed with share spectrum
    bayes.set_sampler("multinest", share_spectrum=True)
    bayes.sampler.setup(n_live_points=500)
    bayes.sample()

    # at this stage we could also
    # save the analysis result to
    # disk but we will simply hold
    # onto them in memory
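    # e.g., using the standard threeML results serialization
    # (hypothetical file name):
    # bayes.results.write_to(f"interval_{interval}_fit.fits", overwrite=True)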

    analysis.append(bayes)
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.7 -0.6 +0.5) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-5.4 -1.4 +1.2) x 10^-1
grb.spectrum.main.Band.xp (3.1 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.11 -0.20 +0.19

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval0 -280.486883
n3_interval0 -244.970721
n4_interval0 -261.515821
total -786.973425

Values of statistical measures:

statistical measures
AIC 1582.060164
BIC 1597.468982
DIC 1560.763694
PDIC 2.453514
log(Z) -343.134610
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (4.13 +/- 0.12) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.56 +/- 0.23) x 10^-1
grb.spectrum.main.Band.xp (6.2 -0.4 +0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.24 +/- 0.11

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval1 -665.287277
n3_interval1 -634.682638
n4_interval1 -637.939419
total -1937.909334

Values of statistical measures:

statistical measures
AIC 3883.931983
BIC 3899.340800
DIC 3858.576427
PDIC 3.533309
log(Z) -844.218040
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.47 -0.19 +0.16) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.10 -0.06 +0.05
grb.spectrum.main.Band.xp (7.3 -1.4 +1.6) x 10^2 keV
grb.spectrum.main.Band.beta -2.79 -0.21 +0.18

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval2 -317.067451
n3_interval2 -283.179039
n4_interval2 -306.033704
total -906.280194

Values of statistical measures:

statistical measures
AIC 1820.673703
BIC 1836.082520
DIC 1791.171269
PDIC 2.194083
log(Z) -394.599635
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.0 +/- 0.4) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.1 +/- 0.9) x 10^-1
grb.spectrum.main.Band.xp (3.4 -0.6 +0.7) x 10^2 keV
grb.spectrum.main.Band.beta -2.39 +/- 0.31

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval3 -291.968425
n3_interval3 -237.340442
n4_interval3 -257.297417
total -786.606285

Values of statistical measures:

statistical measures
AIC 1581.325884
BIC 1596.734701
DIC 1559.783820
PDIC 3.115336
log(Z) -342.429771
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.20 +/- 0.11) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.39 -0.33 +0.30) x 10^-1
grb.spectrum.main.Band.xp (3.41 +/- 0.29) x 10^2 keV
grb.spectrum.main.Band.beta -1.91 +/- 0.05

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval4 -773.646230
n3_interval4 -751.483709
n4_interval4 -741.029206
total -2266.159145

Values of statistical measures:

statistical measures
AIC 4540.431605
BIC 4555.840422
DIC 4520.702334
PDIC 2.562910
log(Z) -988.135027
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.81 +/- 0.19) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.0 +/- 0.5) x 10^-1
grb.spectrum.main.Band.xp (4.2 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.31 -0.23 +0.22

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval5 -531.858597
n3_interval5 -517.537067
n4_interval5 -522.807980
total -1572.203645

Values of statistical measures:

statistical measures
AIC 3152.520604
BIC 3167.929421
DIC 3130.695791
PDIC 3.227603
log(Z) -684.717691
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.00 -0.11 +0.10) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.9 +/- 0.4) x 10^-1
grb.spectrum.main.Band.xp (4.2 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.51 -0.26 +0.25

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval6 -607.082578
n3_interval6 -577.846084
n4_interval6 -570.915681
total -1755.844342

Values of statistical measures:

statistical measures
AIC 3519.801999
BIC 3535.210817
DIC 3496.147700
PDIC 2.855511
log(Z) -764.514106
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.69 +/- 0.11) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.03 +/- 0.05
grb.spectrum.main.Band.xp (4.2 +/- 0.6) x 10^2 keV
grb.spectrum.main.Band.beta -2.51 +/- 0.30

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval7 -659.490075
n3_interval7 -635.045454
n4_interval7 -644.521211
total -1939.056739

Values of statistical measures:

statistical measures
AIC 3886.226793
BIC 3901.635610
DIC 3864.320731
PDIC 3.541201
log(Z) -843.853692
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.57 -0.12 +0.13) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.3 +/- 0.6) x 10^-1
grb.spectrum.main.Band.xp (3.6 +/- 0.4) x 10^2 keV
grb.spectrum.main.Band.beta -2.54 -0.27 +0.28

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval8 -696.972770
n3_interval8 -693.602740
n4_interval8 -660.817064
total -2051.392574

Values of statistical measures:

statistical measures
AIC 4110.898462
BIC 4126.307279
DIC 4090.259485
PDIC 3.130072
log(Z) -893.040204
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.5 -0.8 +0.6) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.2 -2.4 +2.6) x 10^-1
grb.spectrum.main.Band.xp (1.3 -0.5 +0.4) x 10^2 keV
grb.spectrum.main.Band.beta -2.10 -0.27 +0.26

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval9 -646.986849
n3_interval9 -615.032189
n4_interval9 -613.778048
total -1875.797087

Values of statistical measures:

statistical measures
AIC 3759.707488
BIC 3775.116306
DIC 3700.821801
PDIC -44.659574
log(Z) -816.938235
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.06 -0.4 +0.34) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-7.5 -1.3 +1.2) x 10^-1
grb.spectrum.main.Band.xp (2.4 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.28 -0.34 +0.35

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval10 -457.201772
n3_interval10 -433.789521
n4_interval10 -429.351658
total -1320.342951

Values of statistical measures:

statistical measures
AIC 2648.799217
BIC 2664.208034
DIC 2629.676519
PDIC 1.029227
log(Z) -575.162257
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.4 -1.4 +1.3) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-4.2 +/- 2.5) x 10^-1
grb.spectrum.main.Band.xp (1.30 +/- 0.28) x 10^2 keV
grb.spectrum.main.Band.beta -2.28 +/- 0.35

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval11 -289.638482
n3_interval11 -268.977505
n4_interval11 -252.516928
total -811.132915

Values of statistical measures:

statistical measures
AIC 1630.379145
BIC 1645.787962
DIC 1612.277178
PDIC -0.158506
log(Z) -353.306525
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -790.09663695390282      +/-  0.17719989801547933
 Total Likelihood Evaluations:        17478
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1943.8838745010687      +/-  0.21211099606632000
 Total Likelihood Evaluations:        22988
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -908.59923614364413      +/-  0.20454548501368133
 Total Likelihood Evaluations:        18959
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -788.47368633624774      +/-  0.17290761492935447
 Total Likelihood Evaluations:        18829
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -2275.2649838186480      +/-  0.20344594902486149
 Total Likelihood Evaluations:        21860
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1576.6207481495717      +/-  0.18989033756921414
 Total Likelihood Evaluations:        20268
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1760.3587830699259      +/-  0.19437496185222045
 Total Likelihood Evaluations:        20377
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1943.0449313159691      +/-  0.18857036071330044
 Total Likelihood Evaluations:        22386
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -2056.3010603729876      +/-  0.18727295354435483
 Total Likelihood Evaluations:        19153
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1881.0698008101397      +/-  0.14678664996363730
 Total Likelihood Evaluations:        12817
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1324.3600381002725      +/-  0.16690574997700650
 Total Likelihood Evaluations:        15389
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -813.51833729006478      +/-  0.14564191870412616
 Total Likelihood Evaluations:        12718
 Sampling finished. Exiting MultiNest

Examine the fits

Now we can look at the fits in count space to make sure they are ok.

[24]:
for a in analysis:
    a.restore_median_fit()
    _ = display_spectrum_model_counts(a, min_rate=[20, 20, -99], step=False)
../_images/notebooks_grb080916C_41_0.png
../_images/notebooks_grb080916C_41_1.png
../_images/notebooks_grb080916C_41_2.png
../_images/notebooks_grb080916C_41_3.png
../_images/notebooks_grb080916C_41_4.png
../_images/notebooks_grb080916C_41_5.png
../_images/notebooks_grb080916C_41_6.png
../_images/notebooks_grb080916C_41_7.png
../_images/notebooks_grb080916C_41_8.png
../_images/notebooks_grb080916C_41_9.png
../_images/notebooks_grb080916C_41_10.png
../_images/notebooks_grb080916C_41_11.png

Finally, we can plot the models together to see how the spectra evolve with time.

[25]:
fig = plot_spectra(
    *[a.results for a in analysis[::1]],
    flux_unit="erg2/(cm2 s keV)",
    fit_cmap="viridis",
    contour_cmap="viridis",
    contour_style_kwargs=dict(alpha=0.1),
)
../_images/notebooks_grb080916C_43_14.png

This example can serve as a template for performing analysis on GBM data. However, as 3ML provides an abstract interface and modular building blocks, similar analysis pipelines can be built for any time series data.
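
As a closing illustration, the fluence-style preparation above could be wrapped into a small helper function. This is a sketch that only reuses the calls demonstrated in this notebook; adapt the channel selections and file handling to your own instrument:

def build_fluence_plugin(det, dload, background_interval, source_interval):
    # fit the background on the CSPEC data, then apply it to the TTE data
    ts_cspec = TimeSeriesBuilder.from_gbm_cspec_or_ctime(
        det, cspec_or_ctime_file=dload[det]["cspec"], rsp_file=dload[det]["rsp"]
    )
    ts_cspec.set_background_interval(*background_interval.split(","))
    ts_cspec.save_background(f"{det}_bkg.h5", overwrite=True)

    ts_tte = TimeSeriesBuilder.from_gbm_tte(
        det,
        tte_file=dload[det]["tte"],
        rsp_file=dload[det]["rsp"],
        restore_background=f"{det}_bkg.h5",
    )
    ts_tte.set_active_time_interval(source_interval)

    plugin = ts_tte.to_spectrumlike()

    # BGO vs. NaI channel selections, as above
    plugin.set_active_measurements("250-30000" if det.startswith("b") else "9-900")
    plugin.rebin_on_background(1.0)

    return plugin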