threeML.bayesian.autoemcee_sampler module

class threeML.bayesian.autoemcee_sampler.AutoEmceeSampler(likelihood_model=None, data_list=None, **kwargs)[source]

Bases: UnitCubeSampler

sample(quiet=False)[source]

Sample the posterior using the autoemcee ensemble-MCMC method (an emcee-style ensemble run automatically until convergence).

Returns:

setup(num_global_samples=10000, num_chains=4, num_walkers=None, max_ncalls=1000000, max_improvement_loops=4, num_initial_steps=100, min_autocorr_times=0)[source]

Sample until MCMC chains have converged.

The steps are:

  1. Draw num_global_samples from the prior. The num_walkers highest-likelihood points are selected as starting positions.

  2. Set num_steps to num_initial_steps

  3. Run num_chains MCMC ensembles for num_steps steps

  4. For each walker chain, compute auto-correlation length (Convergence requires num_steps/autocorrelation length > min_autocorr_times)

  5. For each parameter, compute the Geweke convergence diagnostic (Convergence requires |z| < 2)

  6. For each ensemble, compute the Gelman-Rubin rank convergence diagnostic (Convergence requires rhat < 1.2)

  7. If converged, stop and return results.

  8. Increase num_steps by 10, and repeat from (3) up to max_improvement_loops times.
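The Geweke and Gelman-Rubin checks in steps 5 and 6 can be sketched in plain NumPy. This is an illustrative simplification, not the diagnostics autoemcee actually calls; the function names are made up for this sketch:

```python
import numpy as np

def geweke_z(chain, first=0.1, last=0.5):
    """Geweke z-score for a 1-D chain: compare the mean of the first
    10% of samples with the mean of the last 50%; |z| < 2 suggests
    the chain is stationary."""
    n = chain.size
    a = chain[: int(first * n)]
    b = chain[int((1.0 - last) * n):]
    return (a.mean() - b.mean()) / np.sqrt(
        a.var(ddof=1) / a.size + b.var(ddof=1) / b.size
    )

def gelman_rubin_rhat(chains):
    """Classic Gelman-Rubin rhat for an (m, n) array of m chains of
    length n; values below ~1.2 indicate the chains have mixed."""
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    W = chains.var(axis=1, ddof=1).mean()      # within-chain variance
    B = n * chains.mean(axis=1).var(ddof=1)    # between-chain variance
    var_hat = (n - 1) / n * W + B / n          # pooled variance estimate
    return np.sqrt(var_hat / W)

# Four chains drawn from the same distribution should pass both checks.
rng = np.random.default_rng(0)
chains = rng.normal(size=(4, 2000))
rhat = gelman_rubin_rhat(chains)
z = geweke_z(chains[0])
```

Shifting the chains apart (e.g. adding a different constant to each) drives the between-chain variance up and rhat well above the 1.2 threshold, which is the failure mode these diagnostics are there to catch.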

num_global_samples: int

Number of samples to draw from the prior.

num_chains: int

Number of independent ensembles to run. If running with MPI, this is set to the number of MPI processes.

num_walkers: int

Ensemble size. If None, max(100, 4 * dim) is used

max_ncalls: int

Maximum number of likelihood function evaluations

num_initial_steps: int

Number of sampler steps to take in the first iteration

max_improvement_loops: int

Number of times MCMC should be re-attempted (see above)

min_autocorr_times: float

If positive, additionally require for convergence that the number of samples is larger than min_autocorr_times times the autocorrelation length.
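The min_autocorr_times criterion from step 4 can be illustrated with a crude NumPy estimate of the integrated autocorrelation length (a sketch only; the estimator autoemcee actually uses differs):

```python
import numpy as np

def integrated_autocorr_time(chain):
    """Crude integrated autocorrelation time of a 1-D chain: sum the
    normalized ACF over positive lags, truncated at the first lag whose
    ACF is non-positive (illustrative, not autoemcee's estimator)."""
    x = np.asarray(chain, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    acf = acf / acf[0]
    nonpos = np.nonzero(acf <= 0)[0]
    cut = nonpos[0] if nonpos.size else acf.size
    return 1.0 + 2.0 * acf[1:cut].sum()

# AR(1) chain with phi = 0.9, whose true autocorrelation time is
# (1 + phi) / (1 - phi) = 19.
rng = np.random.default_rng(1)
phi, n = 0.9, 10000
chain = np.empty(n)
chain[0] = 0.0
for i in range(1, n):
    chain[i] = phi * chain[i - 1] + rng.normal()

tau = integrated_autocorr_time(chain)
min_autocorr_times = 5
converged = n / tau > min_autocorr_times  # the step-4 criterion
```

Here 10000 steps comfortably exceed 5 autocorrelation lengths, so this chain would pass the check; a strongly correlated chain of only a few hundred steps would not.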