Nested sampling (vanilla and SMC).
This module is less tested than the rest of the package.
Moreover, this documentation does not explain precisely how nested sampling works (and this topic is not covered in our book); refer instead to, e.g., the original papers of Skilling, or Chopin and Robert (2010, Biometrika). For nested sampling SMC, see Salomone et al (2018).
Vanilla nested sampling
This module contains classes that implement nested sampling:
NestedSampling: base class;
Nested_RWmoves: nested sampling algorithm based on random walk Metropolis steps.
To use the latter, you must first define a static model, in the same way as in module
smc_samplers. For instance:
    import particles
    from particles import smc_samplers as ssp
    from particles import distributions as dists
    from numpy import random
    from scipy import stats

    class ToyModel(ssp.StaticModel):
        def logpyt(self, theta, t):  # log-likelihood of data-point at time t
            return stats.norm.logpdf(self.data[t], loc=theta)

    my_prior = dists.Normal()  # theta ~ N(0, 1)
    y = random.normal(size=1)  # y | theta ~ N(theta, 1)
    toy_model = ToyModel(data=y, prior=my_prior)
Then, the algorithm may be set and run as follows:
    from particles import nested

    algo = nested.Nested_RWmoves(model=toy_model, N=1000, nsteps=3, eps=1e-8)
    algo.run()
    print('estimate of log-evidence: %f' % algo.lZhats[-1])
This will run a nested sampling algorithm that propagates N=1000 particles;
each time a point is deleted, it is replaced by a new point obtained as
follows: another point is selected at random, then moved through nsteps
steps of a Gaussian random walk Metropolis kernel (which leaves invariant the
prior constrained to the current likelihood contour). To make these steps
reasonably efficient, the covariance matrix of the random walk proposal is
adapted dynamically to the current set of points. The algorithm stops
when the difference between the two most recent estimates of the log-evidence
falls below eps.
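For intuition, here is a minimal sketch of one such constrained step (not the
package's internal code; the helper name and its arguments are hypothetical).
Since the proposal is symmetric, the Metropolis ratio reduces to the prior
ratio, times the indicator that the proposal stays above the likelihood contour:

    import numpy as np

    def constrained_rw_step(x, logprior, loglik, lmin, chol_cov, rng):
        # Hypothetical helper: one Gaussian random-walk Metropolis step whose
        # invariant distribution is the prior restricted to the region
        # {theta: loglik(theta) > lmin}
        prop = x + chol_cov @ rng.standard_normal(x.shape)
        if loglik(prop) > lmin and np.log(rng.uniform()) < logprior(prop) - logprior(x):
            return prop
        return x  # reject: keep the current point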
To implement your own nested sampling algorithm, you must sub-class
NestedSampling and define a method mutate:
    from particles import nested

    class MyNestedSampler(nested.NestedSampling):
        def mutate(self, n, m):
            # implement an MCMC step that replaces point X[n] with the point
            # obtained by starting at X[m] and doing a certain number of steps
            ...
Nested sampling Sequential Monte Carlo
Salomone et al (2018) proposed an SMC sampler inspired by nested sampling. The
target distribution at time t is the prior constrained to the likelihood being
larger than a constant l_t. These constants may be chosen adaptively: in this
implementation, the next l_t is set to the ESSrmin upper-quantile of the
likelihood values of the current points (where ESSrmin is specified by the
user). In other words, l_t is chosen so that the ESS of the reweighted sample
equals ESSrmin * N. (ESSrmin corresponds to 1 - rho in Salomone et al's
notations.)
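As an illustration of this adaptive choice (a hypothetical helper, not the
package's internal code), the next threshold may be computed along these lines:

    import numpy as np

    def next_threshold(llik, ESSrmin=0.3):
        # Hypothetical helper: pick l_t as the ESSrmin upper-quantile of the
        # current log-likelihood values, so that a fraction ESSrmin of the
        # points satisfy llik > l_t
        return np.quantile(llik, 1. - ESSrmin)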
This module implements this SMC sampler as NestedSamplingSMC, a sub-class of
smc_samplers.FKSMCsampler, which may be used in the same way as the other SMC
samplers defined in module smc_samplers:
    fk = nested.NestedSamplingSMC(model=toy_model, wastefree=True, ESSrmin=0.3)
    alg = particles.SMC(fk=fk, N=1_000)
    alg.run()
Upon completion, the dictionary alg.X.shared will contain the successive
estimates of the log-evidence (i.e. the log of the marginal likelihood; in
practice, the final one is the one you want to use), and the successive values
of l_t.
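Since the exact key names are not spelled out above, a simple way to find them
is to inspect the dictionary after the run:

    print(alg.X.shared.keys())  # names under which the log-evidence estimates
                                # and the thresholds l_t are stored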
Note that a waste-free version of NS-SMC is run by default, but the original
paper of Salomone et al (which predates waste-free SMC) only considers a
standard version. (For more details on waste-free SMC vs standard SMC, see
module smc_samplers and the corresponding Jupyter notebook.)
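To run the standard (non waste-free) version considered in the paper, you may
presumably switch that option off when building the Feynman-Kac object
(a sketch, assuming the same toy model and settings as above):

    fk_std = nested.NestedSamplingSMC(model=toy_model, wastefree=False, ESSrmin=0.3)
    alg_std = particles.SMC(fk=fk_std, N=1_000)
    alg_std.run()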
Salomone R., South L., Drovandi C. and Kroese D. (2018). Unbiased and Consistent Nested Sampling via Sequential Monte Carlo. arXiv:1805.03924.
Summary of the classes defined in this module:

NestedSampling: base class for nested sampling algorithms.
NestedSamplingSMC: Feynman-Kac class for the nested sampling SMC algorithm.
Nested_RWmoves: nested sampling with (adaptive) random walk Metropolis moves.

The module also defines two helpers: a function that samples uniformly from
0, ..., N-1, minus m (to select the point from which each MCMC move starts),
and a class that tracks the mean and covariance of a set of points (to adapt
the random walk proposal).