Let $f_0$, $f_1$, and $g$ be three densities and suppose that $X_1, X_2, \dots \sim g$, independently. What happens to the likelihood ratio
$$L_n = \prod_{i=1}^{n} \frac{f_1(X_i)}{f_0(X_i)}$$
as $n \to \infty$?
Clearly, it depends. If $g = f_0$, then $L_n \to 0$ almost surely at an exponential rate. More generally, if $g$ is closer to $f_0$ than to $f_1$, in some sense, we'd expect that $L_n \to 0$. Such a measure of "closeness" or "divergence" between probability distributions is given by the Kullback-Leibler divergence
$$D(g \,\|\, f) = \int g(x) \log \frac{g(x)}{f(x)} \, dx.$$
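As a quick numerical illustration (my addition, not part of the original note), here is a minimal Python sketch that evaluates $D(g \,\|\, f)$ by quadrature for two unit-variance Gaussians and checks it against the known closed form $(\mu_g - \mu_f)^2 / (2\sigma^2)$; the particular densities are assumptions chosen for the example.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Example densities (assumed for illustration): g = N(0, 1), f = N(1, 1).
g = norm(loc=0.0, scale=1.0)
f = norm(loc=1.0, scale=1.0)

# D(g || f) = \int g(x) log(g(x)/f(x)) dx, by numerical integration.
kl_numeric, _ = quad(lambda x: g.pdf(x) * np.log(g.pdf(x) / f.pdf(x)), -10, 10)

# Closed form for equal-variance Gaussians: (mu_g - mu_f)^2 / (2 sigma^2).
kl_closed = (0.0 - 1.0) ** 2 / 2.0

print(kl_numeric, kl_closed)  # both are approximately 0.5
```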
It can be verified that $D(g \,\|\, f) \ge 0$, with equality if and only if $g = f$ almost everywhere, and that, by the strong law of large numbers,
$$\frac{1}{n} \log L_n = \frac{1}{n} \sum_{i=1}^{n} \log \frac{f_1(X_i)}{f_0(X_i)} \longrightarrow D(g \,\|\, f_0) - D(g \,\|\, f_1)$$
almost surely. In particular, if $D(g \,\|\, f_0) < D(g \,\|\, f_1)$, then $L_n \to 0$ almost surely at an exponential rate. Thus the K.L.-divergence can be used to solve our problem.
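To make the limit concrete, here is a short simulation sketch (again an illustration I'm adding, with Gaussian densities chosen as assumptions). With $f_0 = N(0,1)$, $f_1 = N(2,1)$, and $g = N(0.5,1)$, the predicted limit is $D(g \,\|\, f_0) - D(g \,\|\, f_1) = 0.125 - 1.125 = -1$, so $L_n \to 0$ exponentially fast.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed densities for the demo: f0 = N(0,1), f1 = N(2,1), g = N(0.5,1).
# g is closer to f0 than to f1 in KL, so the likelihood ratio should decay.
mu0, mu1, mug = 0.0, 2.0, 0.5

def log_density_ratio(x):
    # log f1(x) - log f0(x) for unit-variance Gaussians.
    return -0.5 * (x - mu1) ** 2 + 0.5 * (x - mu0) ** 2

n = 100_000
x = rng.normal(mug, 1.0, size=n)

# (1/n) log L_n is the sample average of the log density ratios.
avg_log_ratio = np.mean(log_density_ratio(x))

# Limit predicted by the SLLN: D(g||f0) - D(g||f1), in equal-variance closed form.
limit = (mug - mu0) ** 2 / 2 - (mug - mu1) ** 2 / 2

print(avg_log_ratio, limit)  # both are approximately -1.0, so L_n -> 0 exponentially
```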
Better measures of divergence?
There are other measures of divergence that can determine the asymptotic behavior of the likelihood ratio as above (e.g. the discrete distance). However, in this note, I give conditions under which the K.L.-divergence is, up to topological equivalence, the "best" measure of divergence.