Analyzing GRB 080916C

(Image credit: NASA/Swift/Cruz deWilde)

To demonstrate the capabilities and features of 3ML, we will go through a time-integrated and a time-resolved analysis. This example serves both as a standard way to analyze Fermi-GBM data with 3ML and as a template for designing your own instrument's analysis pipeline with 3ML if you have similar data.

3ML provides utilities to reduce time series data to plugins in a correct and statistically justified way (e.g., background fitting of Poisson data is done with a Poisson likelihood). The approach is generic and can be extended. For more details, see the time series documentation.
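
As a quick preview of that pattern, the reduction from a time series to a 3ML plugin uses the same few calls that appear in the cells below. The sketch here is not a runnable cell on its own: the file names are placeholders and the intervals are purely illustrative.

# sketch of the generic reduction pattern used throughout this notebook;
# the file names below are placeholders for the files downloaded later
ts = TimeSeriesBuilder.from_gbm_tte(
    "n3",
    tte_file="glg_tte_n3_bnXXXXXXXXX_vXX.fit",
    rsp_file="glg_cspec_n3_bnXXXXXXXXX_vXX.rsp2",
)

ts.set_background_interval("-20--5", "100-200")  # background fit (Poisson likelihood)
ts.set_active_time_interval("0-60")              # source selection
plugin = ts.to_spectrumlike()                    # plugin ready for spectral fitting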

[1]:
import warnings

warnings.simplefilter("ignore")
[2]:
%%capture
import matplotlib.pyplot as plt
import numpy as np

np.seterr(all="ignore")


from threeML import *
from threeML.io.package_data import get_path_of_data_file
[3]:

silence_warnings()
%matplotlib inline
from jupyterthemes import jtplot

jtplot.style(context="talk", fscale=1, ticks=True, grid=False)
set_threeML_style()

Examining the catalog

As with Swift and Fermi-LAT, 3ML provides a simple interface to the on-line Fermi-GBM catalog. Let’s get the information for GRB 080916C.

[4]:
gbm_catalog = FermiGBMBurstCatalog()
gbm_catalog.query_sources("GRB080916009")
[4]:
Table length=1
    name        ra      dec    trigger_time    t90
   object    float64  float64    float64     float64
------------ -------- -------- ------------- -------
GRB080916009  119.800  -56.600 54725.0088613  62.977

To aid in quickly replicating the catalog analysis, and thanks to the tireless efforts of the Fermi-GBM team, we have added the ability to extract the analysis parameters from the catalog as well as build an astromodels model with the best-fit parameters baked in. Using this information, one can quickly run through the catalog and replicate the entire analysis with a script. Let's give it a try.

[5]:
grb_info = gbm_catalog.get_detector_information()["GRB080916009"]

gbm_detectors = grb_info["detectors"]
source_interval = grb_info["source"]["fluence"]
background_interval = grb_info["background"]["full"]
best_fit_model = grb_info["best fit model"]["fluence"]
model = gbm_catalog.get_model(best_fit_model, "fluence")["GRB080916009"]
[6]:
model
[6]:
Model summary:

N
Point sources 1
Extended sources 0
Particle sources 0


Free parameters (5):

value min_value max_value unit
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.K 0.012255 0.0 None keV-1 s-1 cm-2
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.alpha -1.130424 -1.5 2.0
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.break_energy 309.2031 10.0 None keV
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.break_scale 0.3 0.0 10.0
GRB080916009.spectrum.main.SmoothlyBrokenPowerLaw.beta -2.096931 -5.0 -1.6


Fixed parameters (3):
(abridged. Use complete=True to see all fixed parameters)


Properties (0):

(none)


Linked parameters (0):

(none)

Independent variables:

(none)

Linked functions (0):

(none)

Downloading the data

We provide a simple interface to download the Fermi-GBM data. Using the information from the catalog that we have extracted, we can download just the data from the detectors that were used for the catalog analysis. This will download the CSPEC, TTE and instrument response files from the on-line database.

[7]:
dload = download_GBM_trigger_data("bn080916009", detectors=gbm_detectors)

Let’s first examine the catalog fluence fit. Using the TimeSeriesBuilder, we can fit the background, set the source interval, and create a 3ML plugin for the analysis. We will loop through the detectors, set their appropriate channel selections, and ensure there are enough counts in each bin to make the PGStat profile likelihood valid.

  • First, we fit the background with the CSPEC data using the catalog background selections; we use CSPEC because its longer duration is better suited for background fitting.

  • The background is saved to an HDF5 file that stores the polynomial coefficients and selections, which we can later restore when building the TTE time series.

  • The light curve is plotted.

  • The source selection from the catalog is set and a DispersionSpectrumLike plugin is created.

  • The standard GBM channel selections for spectral analysis are set on the plugin.

[8]:
fluence_plugins = []
time_series = {}
for det in gbm_detectors:

    # fit the background polynomials on the (longer) CSPEC data
    ts_cspec = TimeSeriesBuilder.from_gbm_cspec_or_ctime(
        det, cspec_or_ctime_file=dload[det]["cspec"], rsp_file=dload[det]["rsp"]
    )

    ts_cspec.set_background_interval(*background_interval.split(","))
    ts_cspec.save_background(f"{det}_bkg.h5", overwrite=True)

    # build the TTE time series and restore the saved background
    ts_tte = TimeSeriesBuilder.from_gbm_tte(
        det,
        tte_file=dload[det]["tte"],
        rsp_file=dload[det]["rsp"],
        restore_background=f"{det}_bkg.h5",
    )

    time_series[det] = ts_tte

    # select the catalog fluence interval and plot the light curve
    ts_tte.set_active_time_interval(source_interval)

    ts_tte.view_lightcurve(-40, 100)

    fluence_plugin = ts_tte.to_spectrumlike()

    # standard channel selections: BGO ("b") vs. NaI ("n") detectors
    if det.startswith("b"):

        fluence_plugin.set_active_measurements("250-30000")

    else:

        fluence_plugin.set_active_measurements("9-900")

    # rebin so that every bin has enough background counts
    # for the PGStat profile likelihood to be valid
    fluence_plugin.rebin_on_background(1.0)

    fluence_plugins.append(fluence_plugin)
../_images/notebooks_grb080916C_12_9.png
../_images/notebooks_grb080916C_12_10.png
../_images/notebooks_grb080916C_12_11.png

Setting up the fit

Let’s see if we can reproduce the results from the catalog.

Set priors for the model

We will fit the spectrum using Bayesian analysis, so we must set priors on the model parameters.

[9]:
model.GRB080916009.spectrum.main.shape.alpha.prior = Truncated_gaussian(
    lower_bound=-1.5, upper_bound=1, mu=-1, sigma=0.5
)
model.GRB080916009.spectrum.main.shape.beta.prior = Truncated_gaussian(
    lower_bound=-5, upper_bound=-1.6, mu=-2.25, sigma=0.5
)
model.GRB080916009.spectrum.main.shape.break_energy.prior = Log_normal(mu=2, sigma=1)
model.GRB080916009.spectrum.main.shape.break_energy.bounds = (None, None)
model.GRB080916009.spectrum.main.shape.K.prior = Log_uniform_prior(
    lower_bound=1e-3, upper_bound=1e1
)
model.GRB080916009.spectrum.main.shape.break_scale.prior = Log_uniform_prior(
    lower_bound=1e-4, upper_bound=10
)

Clone the model and set up the Bayesian analysis class

Next, we clone the model we built from the catalog so that we can look at the results later and fit the cloned model. We pass this model and the DataList of the plugins to a BayesianAnalysis class and set the sampler to MultiNest.

[10]:
new_model = clone_model(model)

bayes = BayesianAnalysis(new_model, DataList(*fluence_plugins))

# share spectrum gives a linear speed up when
# spectrumlike plugins have the same RSP input energies
bayes.set_sampler("multinest", share_spectrum=True)

Examine the catalog fitted model

We can quickly examine how well the catalog fit matches the data. There appears to be a discrepancy between the data and the model! Let’s refit to see if we can fix it.

[11]:
fig = display_spectrum_model_counts(bayes, min_rate=20, step=False)
../_images/notebooks_grb080916C_18_0.png

Run the sampler

We let MultiNest condition the model on the data.

[12]:
bayes.sampler.setup(n_live_points=400)
bayes.sample()
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
GRB080916009...K (1.470 +/- 0.018) x 10^-2 1 / (cm2 keV s)
GRB080916009...alpha -1.074 +/- 0.020
GRB080916009...break_energy (2.31 +/- 0.30) x 10^2 keV
GRB080916009...break_scale (2.5 +/- 0.9) x 10^-1
GRB080916009...beta -2.10 +/- 0.09

Values of -log(posterior) at the minimum:

-log(posterior)
b0 -1051.042287
n3 -1018.387837
n4 -1010.109013
total -3079.539137

Values of statistical measures:

statistical measures
AIC 6169.248728
BIC 6188.480938
DIC 6179.931873
PDIC 4.386599
log(Z) -1346.840562
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  400
 dimensionality =    5
 *****************************************************
 ln(ev)=  -3101.2150016316064      +/-  0.22446465867039864
 Total Likelihood Evaluations:        22405
 Sampling finished. Exiting MultiNest

Now our model seems to match the data much better!

[13]:
bayes.restore_median_fit()
fig = display_spectrum_model_counts(bayes, min_rate=20)
../_images/notebooks_grb080916C_22_0.png

But how different are we from the catalog model? Let's plot our fit along with the catalog model. Luckily, 3ML can handle all the units for us.

[14]:
conversion = u.Unit("keV2/(cm2 s keV)").to("erg2/(cm2 s keV)")
energy_grid = np.logspace(1, 4, 100) * u.keV

# evaluate the catalog model as a vFv spectrum on the energy grid;
# astropy units handle the keV^2 -> erg^2 conversion
vFv = (energy_grid**2 * model.get_point_source_fluxes(0, energy_grid)).to(
    "erg2/(cm2 s keV)"
)
[15]:
fig = plot_spectra(bayes.results, flux_unit="erg2/(cm2 s keV)")
ax = fig.get_axes()[0]
_ = ax.loglog(energy_grid, vFv, color="blue", label="catalog model")
../_images/notebooks_grb080916C_25_2.png

Time Resolved Analysis

Now that we have examined the fluence fit, we can move on to a time-resolved analysis.

Selecting a temporal binning

We first get the brightest NaI detector and create time bins via the Bayesian blocks algorithm. We can use the fitted background to make sure that our intervals are chosen in an unbiased way.

[16]:
n3 = time_series["n3"]
[17]:
n3.create_time_bins(0, 60, method="bayesblocks", use_background=True, p0=0.2)

Sometimes, glitches in the GBM data cause spikes that the Bayesian blocks algorithm interprets as fast changes in the count rate. We will have to remove those intervals manually.

Note: In the future, 3ML will provide an automated method to remove these unwanted spikes.

[18]:
fig = n3.view_lightcurve(use_binner=True)
../_images/notebooks_grb080916C_30_0.png
[19]:
# flag the very narrow bins (< 50 ms) produced by the data spikes
bad_bins = []
for i, w in enumerate(n3.bins.widths):

    if w < 5e-2:
        bad_bins.append(i)


# rebuild the bin edges, skipping the flagged bins
edges = [n3.bins.starts[0]]

for i, b in enumerate(n3.bins):

    if i not in bad_bins:
        edges.append(b.stop)

starts = edges[:-1]
stops = edges[1:]


# apply the cleaned bins as a custom binning
n3.create_time_bins(starts, stops, method="custom")

Now our light curve looks much more acceptable.

[20]:
fig = n3.view_lightcurve(use_binner=True)
../_images/notebooks_grb080916C_33_0.png

The time series objects can read time bins from each other, so we will map these time bins onto the other detectors' time series and create a list of plugins for each detector and each time bin created above.

[21]:
time_resolved_plugins = {}

for k, v in time_series.items():
    v.read_bins(n3)
    time_resolved_plugins[k] = v.to_spectrumlike(from_bins=True)

Setting up the model

For the time-resolved analysis, we will fit the classic Band function to the data. We will set some principled priors.

[22]:
band = Band()
band.alpha.prior = Truncated_gaussian(lower_bound=-1.5, upper_bound=1, mu=-1, sigma=0.5)
band.beta.prior = Truncated_gaussian(lower_bound=-5, upper_bound=-1.6, mu=-2, sigma=0.5)
band.xp.prior = Log_normal(mu=2, sigma=1)
band.xp.bounds = (0, None)
band.K.prior = Log_uniform_prior(lower_bound=1e-10, upper_bound=1e3)
ps = PointSource("grb", 0, 0, spectral_shape=band)
band_model = Model(ps)

Perform the fits

One way to perform Bayesian spectral fits to all the intervals is to loop through each one. There are many ways to do this, so find an analysis pattern that works for you.

[23]:
models = []
results = []
analysis = []
for interval in range(12):

    # clone the model above so that we have a separate model
    # for each fit

    this_model = clone_model(band_model)

    # for each detector set up the plugin
    # for this time interval

    this_data_list = []
    for k, v in time_resolved_plugins.items():

        pi = v[interval]

        if k.startswith("b"):
            pi.set_active_measurements("250-30000")
        else:
            pi.set_active_measurements("9-900")

        pi.rebin_on_background(1.0)

        this_data_list.append(pi)

    # create a data list

    dlist = DataList(*this_data_list)

    # set up the sampler and fit

    bayes = BayesianAnalysis(this_model, dlist)

    # get some speed with share spectrum
    bayes.set_sampler("multinest", share_spectrum=True)
    bayes.sampler.setup(n_live_points=500)
    bayes.sample()

    # at this stage we could also
    # save the analysis result to
    # disk but we will simply hold
    # onto them in memory

    analysis.append(bayes)
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.4 -0.7 +1.2) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-6.5 -1.4 +2.5) x 10^-1
grb.spectrum.main.Band.xp (3.8 -1.3 +1.0) x 10^2 keV
grb.spectrum.main.Band.beta -2.20 -0.09 +0.08

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval0 -286.111393
n3_interval0 -251.187588
n4_interval0 -268.645400
total -805.944381

Values of statistical measures:

statistical measures
AIC 1620.002077
BIC 1635.410894
DIC 1569.151605
PDIC -0.218288
log(Z) -343.534834
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.93 -0.09 +0.08) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.01 -0.19 +0.17) x 10^-1
grb.spectrum.main.Band.xp (7.3 -0.4 +0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.43 -0.09 +0.08

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval1 -674.128303
n3_interval1 -643.383139
n4_interval1 -647.447322
total -1964.958764

Values of statistical measures:

statistical measures
AIC 3938.030842
BIC 3953.439659
DIC 3877.340128
PDIC 2.564232
log(Z) -845.511016
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.78 -0.23 +0.22) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.01 +/- 0.06
grb.spectrum.main.Band.xp (4.5 +/- 0.9) x 10^2 keV
grb.spectrum.main.Band.beta -1.73 +/- 0.06

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval2 -324.458477
n3_interval2 -289.107238
n4_interval2 -311.843606
total -925.409322

Values of statistical measures:

statistical measures
AIC 1858.931958
BIC 1874.340775
DIC 1804.597662
PDIC 2.157492
log(Z) -394.282574
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.9 +/- 0.4) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.3 +/- 0.9) x 10^-1
grb.spectrum.main.Band.xp (3.6 +/- 0.7) x 10^2 keV
grb.spectrum.main.Band.beta -2.38 -0.35 +0.33

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval3 -298.363620
n3_interval3 -242.372539
n4_interval3 -262.422612
total -803.158771

Values of statistical measures:

statistical measures
AIC 1614.430856
BIC 1629.839673
DIC 1570.972638
PDIC 3.198839
log(Z) -342.182046
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.97 -0.09 +0.08) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.009 -0.031 +0.030
grb.spectrum.main.Band.xp (4.5 -0.4 +0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.06 -0.09 +0.10

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval4 -778.794792
n3_interval4 -756.792149
n4_interval4 -747.316002
total -2282.902943

Values of statistical measures:

statistical measures
AIC 4573.919200
BIC 4589.328017
DIC 4528.380261
PDIC 3.154066
log(Z) -986.546163
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.80 -0.18 +0.17) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-9.1 -0.5 +0.4) x 10^-1
grb.spectrum.main.Band.xp (4.3 -0.5 +0.6) x 10^2 keV
grb.spectrum.main.Band.beta -2.25 +/- 0.21

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval5 -537.005712
n3_interval5 -523.564815
n4_interval5 -527.561072
total -1588.131600

Values of statistical measures:

statistical measures
AIC 3184.376514
BIC 3199.785331
DIC 3136.555995
PDIC 3.205243
log(Z) -683.284213
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.14 -0.15 +0.04) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.025 -0.05 +0.018
grb.spectrum.main.Band.xp (3.22 -0.24 +0.4) x 10^2 keV
grb.spectrum.main.Band.beta -1.724 -0.015 +0.011

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval6 -617.693661
n3_interval6 -585.304717
n4_interval6 -576.315437
total -1779.313815

Values of statistical measures:

statistical measures
AIC 3566.740944
BIC 3582.149761
DIC 3524.966113
PDIC 0.814767
log(Z) -770.434544
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.66 +/- 0.10) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha -1.04 +/- 0.04
grb.spectrum.main.Band.xp (4.5 -0.6 +0.7) x 10^2 keV
grb.spectrum.main.Band.beta -2.40 -0.28 +0.27

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval7 -662.254158
n3_interval7 -640.963538
n4_interval7 -650.316976
total -1953.534671

Values of statistical measures:

statistical measures
AIC 3915.182657
BIC 3930.591475
DIC 3868.600822
PDIC 3.213232
log(Z) -842.228859
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.52 +/- 0.12) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.5 +/- 0.6) x 10^-1
grb.spectrum.main.Band.xp (3.9 +/- 0.5) x 10^2 keV
grb.spectrum.main.Band.beta -2.45 +/- 0.26

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval8 -702.365954
n3_interval8 -698.329052
n4_interval8 -666.247397
total -2066.942404

Values of statistical measures:

statistical measures
AIC 4141.998122
BIC 4157.406939
DIC 4098.172537
PDIC 3.271606
log(Z) -892.005224
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (1.4 +/- 0.6) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-8.3 +/- 2.1) x 10^-1
grb.spectrum.main.Band.xp (1.1 +/- 0.4) x 10^2 keV
grb.spectrum.main.Band.beta -1.90 -0.21 +0.18

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval9 -648.432725
n3_interval9 -617.203806
n4_interval9 -616.429981
total -1882.066512

Values of statistical measures:

statistical measures
AIC 3772.246339
BIC 3787.655157
DIC 3729.901419
PDIC -16.642676
log(Z) -816.109922
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (2.2 +/- 0.5) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-7.2 +/- 1.6) x 10^-1
grb.spectrum.main.Band.xp (2.3 +/- 0.6) x 10^2 keV
grb.spectrum.main.Band.beta -2.17 -0.31 +0.30

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval10 -460.794387
n3_interval10 -437.744346
n4_interval10 -433.158507
total -1331.697240

Values of statistical measures:

statistical measures
AIC 2671.507795
BIC 2686.916612
DIC 2633.121413
PDIC -0.837693
log(Z) -574.334597
  analysing data from chains/fit-.txt
Maximum a posteriori probability (MAP) point:

result unit
parameter
grb.spectrum.main.Band.K (3.5 -1.5 +1.6) x 10^-2 1 / (cm2 keV s)
grb.spectrum.main.Band.alpha (-4.3 -2.5 +2.6) x 10^-1
grb.spectrum.main.Band.xp (1.28 -0.29 +0.30) x 10^2 keV
grb.spectrum.main.Band.beta -2.21 -0.30 +0.32

Values of -log(posterior) at the minimum:

-log(posterior)
b0_interval11 -292.474140
n3_interval11 -272.452447
n4_interval11 -255.934386
total -820.860973

Values of statistical measures:

statistical measures
AIC 1649.835261
BIC 1665.244078
DIC 1616.577487
PDIC -0.527073
log(Z) -352.600190
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -791.01818821936604      +/-  0.18903297212189751
 Total Likelihood Evaluations:        16725
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1946.8610603329876      +/-  0.22232025610656200
 Total Likelihood Evaluations:        21666
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -907.86917635021894      +/-  0.19288783515012528
 Total Likelihood Evaluations:        21061
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -787.90327818026174      +/-  0.17500574946669500
 Total Likelihood Evaluations:        17037
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -2271.6064892525083      +/-  0.20269873720191420
 Total Likelihood Evaluations:        20060
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1573.3200435357435      +/-  0.19283803044213646
 Total Likelihood Evaluations:        19759
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1773.9910951610032      +/-  0.20973916854074559
 Total Likelihood Evaluations:        20213
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1939.3036145880119      +/-  0.19196528692734277
 Total Likelihood Evaluations:        19956
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -2053.9179309545416      +/-  0.18792166278988984
 Total Likelihood Evaluations:        18410
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1879.1625413414595      +/-  0.14805618730096928
 Total Likelihood Evaluations:        12553
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -1322.4542823375070      +/-  0.16889020577048000
 Total Likelihood Evaluations:        14794
 Sampling finished. Exiting MultiNest
 *****************************************************
 MultiNest v3.10
 Copyright Farhan Feroz & Mike Hobson
 Release Jul 2015

 no. of live points =  500
 dimensionality =    4
 *****************************************************
 ln(ev)=  -811.89194170134022      +/-  0.14655079382070718
 Total Likelihood Evaluations:        12402
 Sampling finished. Exiting MultiNest
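
As noted in the comment inside the loop above, each interval's result could also be written to disk and restored later instead of being held in memory. A minimal sketch using 3ML's analysis-results I/O (the file names are arbitrary):

# optional: persist each interval's fit for later use
for i, a in enumerate(analysis):
    a.results.write_to(f"fit_interval_{i}.fits", overwrite=True)

# in a later session, the fits can be reloaded without re-sampling
reloaded_results = [load_analysis_results(f"fit_interval_{i}.fits") for i in range(12)]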

Examine the fits

Now we can look at the fits in count space to make sure they are ok.

[24]:
for a in analysis:
    a.restore_median_fit()
    _ = display_spectrum_model_counts(a, min_rate=[20, 20, 20], step=False)
../_images/notebooks_grb080916C_41_0.png
../_images/notebooks_grb080916C_41_1.png
../_images/notebooks_grb080916C_41_2.png
../_images/notebooks_grb080916C_41_3.png
../_images/notebooks_grb080916C_41_4.png
../_images/notebooks_grb080916C_41_5.png
../_images/notebooks_grb080916C_41_6.png
../_images/notebooks_grb080916C_41_7.png
../_images/notebooks_grb080916C_41_8.png
../_images/notebooks_grb080916C_41_9.png
../_images/notebooks_grb080916C_41_10.png
../_images/notebooks_grb080916C_41_11.png

Finally, we can plot the models together to see how the spectra evolve with time.

[25]:
fig = plot_spectra(
    *[a.results for a in analysis[::1]],
    flux_unit="erg2/(cm2 s keV)",
    fit_cmap="viridis",
    contour_cmap="viridis",
    contour_style_kwargs=dict(alpha=0.1),
)
../_images/notebooks_grb080916C_43_13.png

This example can serve as a template for performing analysis on GBM data. However, as 3ML provides an abstract interface and modular building blocks, similar analysis pipelines can be built for any time series data.
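
As one example of such an extension, we might track how the fitted parameters evolve across the time bins. The following is a minimal sketch, not part of the original analysis, assuming the analysis list and the n3 binner from above and the get_variates / equal_tail_interval accessors of 3ML's analysis results:

# sketch: evolution of the Band peak energy (xp) across the fitted intervals;
# assumes `analysis` and `n3` from the cells above (only the first 12 bins were fit)
bin_centers = [0.5 * (start + b.stop) for start, b in zip(n3.bins.starts, n3.bins)]

xp_median = []
xp_err = []
for a in analysis:
    xp = a.results.get_variates("grb.spectrum.main.Band.xp")
    low, high = xp.equal_tail_interval(cl=0.68)
    xp_median.append(xp.median)
    xp_err.append(0.5 * (high - low))

fig, ax = plt.subplots()
ax.errorbar(bin_centers[: len(analysis)], xp_median, yerr=xp_err, fmt="o")
ax.set_xlabel("time since trigger (s)")
ax.set_ylabel("Band xp (keV)")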