Convergence of Markov Chains in Information Divergence

Abstract

Information-theoretic methods are used to prove convergence in information divergence of reversible Markov chains. Some ergodic theorems for information divergence are also proved.
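The abstract's central object, convergence in information divergence, can be illustrated numerically. The sketch below (not taken from the paper; the chain, states, and helper `kl` are illustrative assumptions) builds a small reversible Markov chain and tracks the divergence D(μPⁿ ‖ π) between the evolving distribution and the stationary distribution π; by the data-processing inequality this sequence is non-increasing.

```python
import numpy as np

# Illustrative example, not the paper's construction: a reversible chain
# on three states. P is symmetric, so the stationary distribution pi is
# uniform and detailed balance pi_i P_ij = pi_j P_ji holds.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
])
pi = np.full(3, 1.0 / 3.0)  # stationary distribution of the symmetric P

def kl(p, q):
    """Information divergence D(p || q) in nats (assumes q > 0 where p > 0)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Start from a point mass and record D(mu P^n || pi) over ten steps.
mu = np.array([1.0, 0.0, 0.0])
divs = []
for _ in range(10):
    divs.append(kl(mu, pi))
    mu = mu @ P  # one step of the chain acting on the distribution

# The divergence is non-increasing and tends to 0 for this ergodic chain.
assert all(divs[i] >= divs[i + 1] for i in range(len(divs) - 1))
```

The monotonicity here holds for any Markov chain with stationary distribution π; the paper's results concern the stronger question of convergence of the divergence itself for reversible chains.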

Publication
Journal of Theoretical Probability 22 (1), 2009, pp. 186–202