
© 2019 Biometrika Trust.

In this paper we revisit the weighted likelihood bootstrap, a method that generates samples from an approximate Bayesian posterior of a parametric model. We show that the same method can be derived, without approximation, under a Bayesian nonparametric model with the parameter of interest defined through minimizing an expected negative log-likelihood under an unknown sampling distribution. This interpretation enables us to extend the weighted likelihood bootstrap to posterior sampling for parameters minimizing an expected loss. We call this method the loss-likelihood bootstrap, and we make a connection between it and general Bayesian updating, which is a way of updating prior belief distributions that does not need the construction of a global probability model, yet requires the calibration of two forms of loss function. The loss-likelihood bootstrap is used to calibrate the general Bayesian posterior by matching asymptotic Fisher information. We demonstrate the proposed method on a number of examples.
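The weighted likelihood bootstrap described above can be illustrated with a minimal sketch: each posterior draw is obtained by sampling random Dirichlet weights over the observations and maximizing the corresponding weighted log-likelihood. The example below uses a normal location model with known unit variance, for which the weighted maximum likelihood estimate has a closed form (the weighted sample mean); the data, sample sizes, and function name are illustrative assumptions, not part of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: draws from a normal model with unknown mean (illustrative).
x = rng.normal(loc=2.0, scale=1.0, size=200)

def weighted_likelihood_bootstrap(x, n_draws=2000, rng=rng):
    """Approximate posterior samples for the normal mean via the
    weighted likelihood bootstrap: reweight the log-likelihood with
    random Dirichlet(1, ..., 1) weights and maximize.  For the normal
    location model the maximizer is the weighted sample mean."""
    n = len(x)
    draws = np.empty(n_draws)
    for b in range(n_draws):
        # Dirichlet(1, ..., 1) weights, i.e. normalized Exp(1) draws.
        w = rng.dirichlet(np.ones(n))
        # argmax_theta sum_i w_i * log N(x_i; theta, 1) = sum_i w_i * x_i
        draws[b] = np.sum(w * x)
    return draws

posterior = weighted_likelihood_bootstrap(x)
print(posterior.mean(), posterior.std())
```

The loss-likelihood bootstrap of the paper replaces the negative log-likelihood with a general loss, so each draw would instead minimize a weighted empirical loss, typically by numerical optimization rather than in closed form.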

Original publication

DOI: 10.1093/biomet/asz006
Type: Journal article
Journal: Biometrika
Publication Date: 01/01/2019
Volume: 106
Pages: 465–478