Analyzing GRB 080916C

(Image credit: NASA/Swift/Cruz deWilde)

To demonstrate the capabilities and features of 3ML, we will go through a time-integrated and time-resolved analysis. This example serves both as a standard way to analyze Fermi-GBM data with 3ML and as a template for designing your own instrument’s analysis pipeline with 3ML if you have similar data.

3ML provides utilities to reduce time series data to plugins in a correct and statistically justified way (e.g., background fitting of Poisson data is done with a Poisson likelihood). The approach is generic and can be extended. For more details, see the time series documentation.

[2]:
%%capture
import matplotlib.pyplot as plt
import numpy as np

np.seterr(all="ignore")


from threeML import *
from threeML.io.package_data import get_path_of_data_file

Examining the catalog

As with Swift and Fermi-LAT, 3ML provides a simple interface to the on-line Fermi-GBM catalog. Let’s get the information for GRB 080916C.

[4]:
gbm_catalog = FermiGBMBurstCatalog()
gbm_catalog.query_sources("GRB080916009")
[4]:
Table length=1
name           ra       dec      trigger_time    t90
object         float64  float64  float64         float64
GRB080916009   119.800  -56.600  54725.0088613   62.977

To aid in quickly replicating the catalog analysis, and thanks to the tireless efforts of the Fermi-GBM team, we have added the ability to extract the analysis parameters from the catalog as well as build an astromodels model with the best-fit parameters baked in. Using this information, one can quickly run through the catalog and replicate the entire analysis with a script. Let’s give it a try.

[5]:
grb_info = gbm_catalog.get_detector_information()["GRB080916009"]

gbm_detectors = grb_info["detectors"]
source_interval = grb_info["source"]["fluence"]
background_interval = grb_info["background"]["full"]
best_fit_model = grb_info["best fit model"]["fluence"]
model = gbm_catalog.get_model(best_fit_model, "fluence")["GRB080916009"]
[6]:
model
[6]:
Model summary:

N
Point sources 1
Extended sources 0
Particle sources 0


Free parameters (5):

value min_value max_value unit
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.K 0.012255 0.0 None keV-1 s-1 cm-2
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.alpha -1.130424 -1.5 2.0
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.break_energy 309.2031 10.0 None keV
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.break_scale 0.3 0.0 10.0
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.beta -2.096931 -5.0 -1.6


Fixed parameters (3):
(abridged. Use complete=True to see all fixed parameters)


Linked parameters (0):

(none)

Independent variables:

(none)
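The summary above abridges the fixed parameters. To inspect every parameter, one option (a minimal sketch, assuming the astromodels display method and its complete flag, as suggested by the note in the output) is:

# show the full model summary, including all fixed parameters
model.display(complete=True)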

Downloading the data

We provide a simple interface to download the Fermi-GBM data. Using the information from the catalog that we have extracted, we can download just the data from the detectors that were used for the catalog analysis. This will download the CSPEC, TTE and instrument response files from the on-line database.

[7]:
dload = download_GBM_trigger_data("bn080916009", detectors=gbm_detectors)
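
Before building any plugins, it can be useful to check what was downloaded. A minimal sketch (the "cspec", "tte", and "rsp" keys are the ones used in the loop below):

# print the downloaded file paths for each detector
for det in gbm_detectors:
    for file_type in ("cspec", "tte", "rsp"):
        print(det, file_type, dload[det][file_type])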

Let’s first examine the catalog fluence fit. Using the TimeSeriesBuilder, we can fit the background, set the source interval, and create a 3ML plugin for the analysis. We will loop through the detectors, set their appropriate channel selections, and ensure there are enough counts in each bin to make the PGStat profile likelihood valid.

  • First we use the CSPEC data to fit the background using the background selections. We use CSPEC because it has a longer duration for fitting the background.

  • The background is saved to an HDF5 file that stores the polynomial coefficients and selections which we can read in to the TTE file later.

  • The light curve is plotted.

  • The source selection from the catalog is set and a DispersionSpectrumLike plugin is created.

  • The standard GBM channel selections for spectral analysis are applied to the plugin.

[8]:
fluence_plugins = []
time_series = {}
for det in gbm_detectors:

    ts_cspec = TimeSeriesBuilder.from_gbm_cspec_or_ctime(
        det, cspec_or_ctime_file=dload[det]["cspec"], rsp_file=dload[det]["rsp"]
    )

    ts_cspec.set_background_interval(*background_interval.split(","))
    ts_cspec.save_background(f"{det}_bkg.h5", overwrite=True)

    ts_tte = TimeSeriesBuilder.from_gbm_tte(
        det,
        tte_file=dload[det]["tte"],
        rsp_file=dload[det]["rsp"],
        restore_background=f"{det}_bkg.h5",
    )

    time_series[det] = ts_tte

    ts_tte.set_active_time_interval(source_interval)

    ts_tte.view_lightcurve(-40, 100)

    fluence_plugin = ts_tte.to_spectrumlike()

    if det.startswith("b"):

        fluence_plugin.set_active_measurements("250-30000")

    else:

        fluence_plugin.set_active_measurements("9-900")

    fluence_plugin.rebin_on_background(1.0)

    fluence_plugins.append(fluence_plugin)
../_images/notebooks_grb080916C_12_9.png
../_images/notebooks_grb080916C_12_10.png
../_images/notebooks_grb080916C_12_11.png

Setting up the fit

Let’s see if we can reproduce the results from the catalog.

Set priors for the model

We will fit the spectrum using Bayesian analysis, so we must set priors on the model parameters.

[9]:
model.GRB080916009.spectrum.main.shape.alpha.prior = Truncated_gaussian(
    lower_bound=-1.5, upper_bound=1, mu=-1, sigma=0.5
)
model.GRB080916009.spectrum.main.shape.beta.prior = Truncated_gaussian(
    lower_bound=-5, upper_bound=-1.6, mu=-2.25, sigma=0.5
)
model.GRB080916009.spectrum.main.shape.break_energy.prior = Log_normal(mu=2, sigma=1)
model.GRB080916009.spectrum.main.shape.break_energy.bounds = (None, None)
model.GRB080916009.spectrum.main.shape.K.prior = Log_uniform_prior(
    lower_bound=1e-3, upper_bound=1e1
)
model.GRB080916009.spectrum.main.shape.break_scale.prior = Log_uniform_prior(
    lower_bound=1e-4, upper_bound=10
)

Clone the model and set up the Bayesian analysis class

Next, we clone the model we built from the catalog, so that the original is kept for comparison later, and fit the clone. We pass this model and the DataList of the plugins to a BayesianAnalysis class and set the sampler to MultiNest.

[10]:
new_model = clone_model(model)

bayes = BayesianAnalysis(new_model, DataList(*fluence_plugins))

# share spectrum gives a linear speed up when
# spectrumlike plugins have the same RSP input energies
bayes.set_sampler("multinest", share_spectrum=True)

Examine the catalog-fitted model

We can quickly examine how well the catalog fit matches the data. There appears to be a discrepancy between the data and the model! Let’s refit to see if we can fix it.

[11]:
fig = display_spectrum_model_counts(bayes, min_rate=20, step=False)
../_images/notebooks_grb080916C_18_0.png

Run the sampler

We let MultiNest condition the model on the data.

[12]:
bayes.sampler.setup(n_live_points=400)
bayes.sample()
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
GRB080916009...K (1.473 -0.018 +0.017) x 10^-2 1 / (cm2 keV s)
GRB080916009...alpha -1.070 +/- 0.017
GRB080916009...break_energy (2.21 +/- 0.25) x 10^2 keV
GRB080916009...break_scale (2.1 -0.7 +0.8) x 10^-1
GRB080916009...beta -2.14 +/- 0.09

Values of -log(posterior) at the minimum:

-log(posterior)
b0 -1051.683952
n3 -1023.415834
n4 -1014.315825
total -3089.415611

Values of statistical measures:

statistical measures
AIC 6189.001677
BIC 6208.233887
DIC 6175.798961
PDIC 4.202800
log(Z) -1347.866259
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  400
 dimensionality =    5
 *****************************************************
 ln(ev)=  -3103.5767542977746      +/-  0.22628974168447979
 Total Likelihood Evaluations:        21168
 Sampling finished. Exiting MultiNest

Now our model matches the data much better!

[13]:
bayes.restore_median_fit()
fig = display_spectrum_model_counts(bayes, min_rate=20)
../_images/notebooks_grb080916C_22_0.png
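
At this point we could also persist the fit for later use. A short sketch (the file name is just an example; write_to and load_analysis_results are the 3ML result I/O helpers):

# save the fluence fit results to disk...
bayes.results.write_to("fluence_fit_results.fits", overwrite=True)

# ...and read them back later without re-running the sampler
reloaded_results = load_analysis_results("fluence_fit_results.fits")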

But how different are we from the catalog model? Let’s plot our fit along with the catalog model. Luckily, 3ML can handle all the units for us.

[14]:
conversion = u.Unit("keV2/(cm2 s keV)").to("erg2/(cm2 s keV)")
energy_grid = np.logspace(1, 4, 100) * u.keV
vFv = (energy_grid ** 2 * model.get_point_source_fluxes(0, energy_grid)).to(
    "erg2/(cm2 s keV)"
)
[15]:
fig = plot_spectra(bayes.results, flux_unit="erg2/(cm2 s keV)")
ax = fig.get_axes()[0]
_ = ax.loglog(energy_grid, vFv, color="blue", label="catalog model")
../_images/notebooks_grb080916C_25_3.png

Time Resolved Analysis

Now that we have examined the fluence fit, we can move on to a time-resolved analysis.

Selecting a temporal binning

We first get the brightest NaI detector and create time bins via the Bayesian blocks algorithm. We can use the fitted background to make sure that our intervals are chosen in an unbiased way.

[16]:
n3 = time_series["n3"]
[17]:
n3.create_time_bins(0, 60, method="bayesblocks", use_background=True, p0=0.2)

Sometimes, glitches in the GBM data cause spikes that the Bayesian blocks algorithm interprets as rapid changes in the count rate. We will have to remove those intervals manually.

Note: In the future, 3ML will provide an automated method to remove these unwanted spikes.

[18]:
fig = n3.view_lightcurve(use_binner=True)
../_images/notebooks_grb080916C_30_0.png
[19]:
bad_bins = []

# flag any Bayesian-block interval narrower than 50 ms as a spurious spike
for i, w in enumerate(n3.bins.widths):

    if w < 5e-2:
        bad_bins.append(i)


# rebuild the bin edges, skipping the flagged intervals
edges = [n3.bins.starts[0]]

for i, b in enumerate(n3.bins):

    if i not in bad_bins:
        edges.append(b.stop)

starts = edges[:-1]
stops = edges[1:]


# apply the cleaned edges as a custom binning
n3.create_time_bins(starts, stops, method="custom")

Now our light curve looks much more acceptable.

[20]:
fig = n3.view_lightcurve(use_binner=True)
../_images/notebooks_grb080916C_33_0.png

The time series objects can read time bins from each other, so we will map these time bins onto the other detectors’ time series and create spectrum-like plugins for each detector and each time bin created above.

[21]:
time_resolved_plugins = {}

for k, v in time_series.items():
    v.read_bins(n3)
    time_resolved_plugins[k] = v.to_spectrumlike(from_bins=True)

Setting up the model

For the time-resolved analysis, we will fit the classic Band function to the data. We will set some principled priors.

[22]:
band = Band()
band.alpha.prior = Truncated_gaussian(lower_bound=-1.5, upper_bound=1, mu=-1, sigma=0.5)
band.beta.prior = Truncated_gaussian(lower_bound=-5, upper_bound=-1.6, mu=-2, sigma=0.5)
band.xp.prior = Log_normal(mu=2, sigma=1)
band.xp.bounds = (0, None)
band.K.prior = Log_uniform_prior(lower_bound=1e-10, upper_bound=1e3)
ps = PointSource("grb", 0, 0, spectral_shape=band)
band_model = Model(ps)

Perform the fits

One way to perform Bayesian spectral fits to all the intervals is to loop through each one. There are many ways to do this, so find an analysis pattern that works for you.

[23]:
models = []
results = []
analysis = []
for interval in range(12):

    # clone the model above so that we have a separate model
    # for each fit

    this_model = clone_model(band_model)

    # for each detector set up the plugin
    # for this time interval

    this_data_list = []
    for k, v in time_resolved_plugins.items():

        pi = v[interval]

        if k.startswith("b"):
            pi.set_active_measurements("250-30000")
        else:
            pi.set_active_measurements("9-900")

        pi.rebin_on_background(1.0)

        this_data_list.append(pi)

    # create a data list

    dlist = DataList(*this_data_list)

    # set up the sampler and fit

    bayes = BayesianAnalysis(this_model, dlist)

    # get some speed with share spectrum
    bayes.set_sampler("multinest", share_spectrum=True)
    bayes.sampler.setup(n_live_points=500)
    bayes.sample()

    # at this stage we could also
    # save the analysis results to
    # disk, but we will simply hold
    # onto them in memory

    analysis.append(bayes)
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.8 +/- 0.6) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-4.9 -1.2 +1.3) x 10^-1
grb.spectrum.main.Band.xp (3.1 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.23 -0.27 +0.26

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval0 -280.530920
n3_interval0 -245.036015
n4_interval0 -261.458502
total -787.025437

Values of statistical measures:

statistical measures
AIC 1582.164189
BIC 1597.573006
DIC 1561.121298
PDIC 2.604364
log(Z) -343.038057
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (4.17 -0.18 +0.19) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.37 -0.30 +0.34) x 10^-1
grb.spectrum.main.Band.xp (6.1 -0.8 +0.7) x 10^2 keV
grb.spectrum.main.Band.beta -2.41 -0.14 +0.22

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval1 -666.731329
n3_interval1 -634.420252
n4_interval1 -637.491930
total -1938.643512

Values of statistical measures:

statistical measures
AIC 3885.400337
BIC 3900.809155
DIC 3861.591038
PDIC 2.957982
log(Z) -845.696553
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.23 -0.20 +0.25) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.1 +/- 0.4) x 10^-1
grb.spectrum.main.Band.xp (3.8 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.48 -0.09 +0.13

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval2 -317.114224
n3_interval2 -282.758346
n4_interval2 -308.848720
total -908.721291

Values of statistical measures:

statistical measures
AIC 1825.555896
BIC 1840.964713
DIC 1803.994519
PDIC 2.041285
log(Z) -397.885413
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.0 -0.4 +0.5) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.2 -1.1 +1.3) x 10^-1
grb.spectrum.main.Band.xp (3.4 +/- 0.7) x 10^2 keV
grb.spectrum.main.Band.beta -2.47 -0.4 +0.29

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval3 -292.185280
n3_interval3 -237.258490
n4_interval3 -257.187913
total -786.631683

Values of statistical measures:

statistical measures
AIC 1581.376681
BIC 1596.785499
DIC 1559.851674
PDIC 2.888791
log(Z) -342.407341
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.13 +/- 0.09) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.54 -0.32 +0.30) x 10^-1
grb.spectrum.main.Band.xp (3.70 +/- 0.32) x 10^2 keV
grb.spectrum.main.Band.beta -2.01 -0.09 +0.08

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval4 -773.605878
n3_interval4 -751.298751
n4_interval4 -741.137085
total -2266.041714

Values of statistical measures:

statistical measures
AIC 4540.196743
BIC 4555.605561
DIC 4519.680223
PDIC 3.099649
log(Z) -987.330851
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.74 -0.19 +0.18) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.3 -0.5 +0.4) x 10^-1
grb.spectrum.main.Band.xp (4.5 +/- 0.6) x 10^2 keV
grb.spectrum.main.Band.beta -2.55 +/- 0.22

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval5 -532.085940
n3_interval5 -518.269236
n4_interval5 -522.146663
total -1572.501840

Values of statistical measures:

statistical measures
AIC 3153.116994
BIC 3168.525812
DIC 3131.090533
PDIC 2.844994
log(Z) -685.413873
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.30 -0.08 +0.14) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.02 -0.28 +0.4) x 10^-1
grb.spectrum.main.Band.xp (3.23 -0.31 +0.15) x 10^2 keV
grb.spectrum.main.Band.beta -2.13 -0.05 +0.09

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval6 -607.156428
n3_interval6 -578.832863
n4_interval6 -570.441709
total -1756.431000

Values of statistical measures:

statistical measures
AIC 3520.975314
BIC 3536.384132
DIC 3502.132929
PDIC 2.818063
log(Z) -766.915382
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.32 -0.07 +0.05) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.82 -0.17 +0.15) x 10^-1
grb.spectrum.main.Band.xp (2.36 +/- 0.06) x 10^2 keV
grb.spectrum.main.Band.beta -2.410 +/- 0.016

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval7 -660.374068
n3_interval7 -646.548731
n4_interval7 -644.210593
total -1951.133393

Values of statistical measures:

statistical measures
AIC 3910.380100
BIC 3925.788918
DIC 3891.067616
PDIC 0.682921
log(Z) -853.553170
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.58 +/- 0.12) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.2 +/- 0.6) x 10^-1
grb.spectrum.main.Band.xp (3.6 +/- 0.4) x 10^2 keV
grb.spectrum.main.Band.beta -2.52 -0.28 +0.27

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval8 -697.025023
n3_interval8 -693.572311
n4_interval8 -660.865529
total -2051.462863

Values of statistical measures:

statistical measures
AIC 4111.039040
BIC 4126.447857
DIC 4090.582045
PDIC 3.277601
log(Z) -893.018078
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.4 +/- 0.6) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.0 -2.3 +2.4) x 10^-1
grb.spectrum.main.Band.xp (1.2 +/- 0.4) x 10^2 keV
grb.spectrum.main.Band.beta -2.10 -0.28 +0.27

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval9 -646.828739
n3_interval9 -614.988472
n4_interval9 -613.828748
total -1875.645960

Values of statistical measures:

statistical measures
AIC 3759.405235
BIC 3774.814052
DIC 3724.675508
PDIC -21.056162
log(Z) -816.813534
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.26 -0.35 +0.31) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-7.3 -1.0 +0.7) x 10^-1
grb.spectrum.main.Band.xp (1.93 -0.34 +0.30) x 10^2 keV
grb.spectrum.main.Band.beta -1.91 -0.09 +0.11

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval10 -457.260282
n3_interval10 -434.078593
n4_interval10 -428.694623
total -1320.033497

Values of statistical measures:

statistical measures
AIC 2648.180309
BIC 2663.589126
DIC 2630.958808
PDIC 1.296554
log(Z) -576.356297
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.5 -1.5 +1.7) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-4.3 -2.8 +2.9) x 10^-1
grb.spectrum.main.Band.xp (1.30 +/- 0.29) x 10^2 keV
grb.spectrum.main.Band.beta -2.31 +/- 0.34

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval11 -289.637346
n3_interval11 -269.042026
n4_interval11 -252.511825
total -811.191197

Values of statistical measures:

statistical measures
AIC 1630.495708
BIC 1645.904525
DIC 1611.937526
PDIC -0.655920
log(Z) -353.342202
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -789.87431751110967      +/-  0.17574330556097273
 Total Likelihood Evaluations:        17113
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1947.2882769388550      +/-  0.22003699561034681
 Total Likelihood Evaluations:        23524
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -916.16502134446000      +/-  0.20164341838537120
 Total Likelihood Evaluations:        20405
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -788.42204018991049      +/-  0.17173237406343630
 Total Likelihood Evaluations:        17610
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -2273.4132989115965      +/-  0.19885517703529479
 Total Likelihood Evaluations:        20956
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1578.2237661848390      +/-  0.19778014682703157
 Total Likelihood Evaluations:        18756
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1765.8879269149984      +/-  0.20310594770120488
 Total Likelihood Evaluations:        21070
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1965.3788061706084      +/-  0.22006898485645637
 Total Likelihood Evaluations:        19074
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -2056.2501148128754      +/-  0.18665350283847348
 Total Likelihood Evaluations:        20436
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1880.7826673977941      +/-  0.14370716363961272
 Total Likelihood Evaluations:        13232
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1327.1094172664107      +/-  0.17707279308793208
 Total Likelihood Evaluations:        15908
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -813.60048737778311      +/-  0.14576264452360096
 Total Likelihood Evaluations:        12982
 Sampling finished. Exiting MultiNest
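
As the loop comment above notes, each interval’s results could also be written to disk instead of being held in memory. A minimal sketch (the file names are illustrative):

# optionally persist each interval's fit for later reloading
for i, a in enumerate(analysis):
    a.results.write_to(f"band_fit_interval_{i}.fits", overwrite=True)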

Examine the fits

Now we can look at the fits in count space to make sure they are ok.

[24]:
for a in analysis:
    a.restore_median_fit()
    _ = display_spectrum_model_counts(a, min_rate=[20, 20, -99], step=False)
../_images/notebooks_grb080916C_41_0.png
../_images/notebooks_grb080916C_41_1.png
../_images/notebooks_grb080916C_41_2.png
../_images/notebooks_grb080916C_41_3.png
../_images/notebooks_grb080916C_41_4.png
../_images/notebooks_grb080916C_41_5.png
../_images/notebooks_grb080916C_41_6.png
../_images/notebooks_grb080916C_41_7.png
../_images/notebooks_grb080916C_41_8.png
../_images/notebooks_grb080916C_41_9.png
../_images/notebooks_grb080916C_41_10.png
../_images/notebooks_grb080916C_41_11.png

Finally, we can plot the models together to see how the spectra evolve with time.

[25]:
fig = plot_spectra(
    *[a.results for a in analysis[::1]],
    flux_unit="erg2/(cm2 s keV)",
    fit_cmap="viridis",
    contour_cmap="viridis",
    contour_style_kwargs=dict(alpha=0.1),
)
../_images/notebooks_grb080916C_43_14.png

This example can serve as a template for performing analysis on GBM data. However, as 3ML provides an abstract interface and modular building blocks, similar analysis pipelines can be built for any time series data.
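
As a final illustration, one could track how the spectral parameters evolve across the intervals using the results already in memory. A sketch using the posterior variates (the parameter path follows the Band model built above; the plotting choices are just one possibility):

# collect the median Band peak energy (xp) for each time-resolved interval
xp_medians = []
for a in analysis:
    xp = a.results.get_variates("grb.spectrum.main.Band.xp")
    xp_medians.append(xp.median)

fig, ax = plt.subplots()
ax.plot(range(len(xp_medians)), xp_medians, "o-")
ax.set_xlabel("interval number")
ax.set_ylabel("Band xp (keV)")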