Theoretical Economics 18 (2023), 1585–1622
Pathwise concentration bounds for Bayesian beliefs
Drew Fudenberg, Giacomo Lanzani, Philipp Strack
Abstract
We show that Bayesian posteriors concentrate on the outcome distributions that approximately minimize the Kullback-Leibler divergence from the empirical distribution, uniformly over sample paths, even when the prior does not have full support. This generalizes the uniform convergence result of Diaconis and Freedman (1990) to, for example, priors that have finite support, are constrained by independence assumptions, or have a parametric form that cannot match some probability distributions. The concentration result lets us provide a rate of convergence for Berk (1996)'s result on the limiting behavior of posterior beliefs when the prior is misspecified. We provide a bound on approximation errors in "anticipated-utility" models, and extend our analysis to outcomes that are perceived to follow a Markov process.
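As an illustration of the abstract's central claim (this sketch is not part of the paper, and all parameter values are hypothetical), the following minimal Python simulation uses a misspecified, finite-support prior over Bernoulli models: the true outcome distribution lies outside the prior's support, yet the posterior concentrates on the candidate model that minimizes the KL divergence from the empirical distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: the true distribution is Bernoulli(0.5), but the
# prior has finite support on Bernoulli(0.2) and Bernoulli(0.7), so the
# model is misspecified (the truth is outside the prior's support).
models = np.array([0.2, 0.7])            # candidate Bernoulli parameters
log_prior = np.log(np.array([0.5, 0.5]))  # uniform prior over the two models

def kl_bernoulli(p, q):
    """KL divergence D(Bernoulli(p) || Bernoulli(q))."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

n = 5000
x = rng.random(n) < 0.5                   # i.i.d. draws from Bernoulli(0.5)

# Bayesian update: log posterior = log prior + log likelihood of the sample.
k = x.sum()
log_lik = k * np.log(models) + (n - k) * np.log(1 - models)
log_post = log_prior + log_lik
post = np.exp(log_post - log_post.max())
post /= post.sum()

# The empirical frequency determines the KL divergence from the
# empirical distribution to each candidate model.
p_hat = x.mean()
kl = kl_bernoulli(p_hat, models)

print("posterior mass:", post)            # mass piles onto one model
print("KL from empirical dist:", kl)      # its argmin matches the posterior mode
```

Here Bernoulli(0.7) is the KL minimizer relative to the empirical distribution, and the posterior places nearly all its mass on it, in line with the Berk-style limiting behavior the abstract describes; the paper's contribution is a pathwise, uniform rate for this kind of concentration.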
Keywords: Learning, misspecification, Bayesian updating, consistency, concentration of beliefs
JEL classification: D83, D90