# Remark on the asymptotics of the likelihood ratio and the K.-L. divergence

## The problem.

Let $f, g, h$ be three densities and suppose that $x_i \sim h$, $i \in \mathbb{N}$, independently. What happens to the likelihood ratio

$\prod_{i=1}^n \frac{f(x_i)}{g(x_i)}$

as $n\rightarrow \infty$?

Clearly, it depends. If $h = g \not = f$, then

$\prod_{i=1}^n \frac{f(x_i)}{g(x_i)} \rightarrow 0$

almost surely at an exponential rate. More generally, if $h$ is closer to $g$ than to $f$, in some sense, we’d expect that $\prod_{i=1}^n \frac{f(x_i)}{g(x_i)} \rightarrow 0$. Such a measure of “closeness” or “divergence” between probability distributions is given by the Kullback-Leibler divergence

$D_{KL}(f, g) = \int f \log\frac{f}{g}.$

It can be verified that $D_{KL}(f,g) \geq 0$ with equality if and only if $f=g$, and that

$D_{KL}(h,g) < D_{KL}(h,f) \Longrightarrow \prod_{i=1}^n \frac{f(x_i)}{g(x_i)} \rightarrow 0 \qquad (1)$

almost surely at an exponential rate. (Indeed, by the strong law of large numbers, $\frac{1}{n}\sum_{i=1}^n \log\frac{f(x_i)}{g(x_i)} \rightarrow D_{KL}(h,g) - D_{KL}(h,f) < 0$ almost surely.) Thus the K.-L. divergence can be used to solve our problem.
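As a quick sanity check on $(1)$, here is a minimal numerical sketch with normal densities of my own choosing (the post does not specify an example); recall that $D_{KL}$ between $N(\mu_1, 1)$ and $N(\mu_2, 1)$ equals $(\mu_1 - \mu_2)^2/2$.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# My choice of densities: h = N(0, 1) (the sampling density),
# g = N(0.5, 1), f = N(2, 1).  With equal variances,
# D_KL(h, g) = 0.5**2 / 2 = 0.125 < D_KL(h, f) = 2**2 / 2 = 2.
n = 10_000
x = rng.standard_normal(n)  # x_i ~ h

# log of the likelihood ratio prod_{i<=k} f(x_i)/g(x_i), for k = 1..n
log_ratio = np.cumsum(norm.logpdf(x, loc=2) - norm.logpdf(x, loc=0.5))

# By the SLLN, log_ratio[k]/k -> D_KL(h, g) - D_KL(h, f) = -1.875,
# so the likelihood ratio itself decays to 0 exponentially fast.
print(log_ratio[-1] / n)  # close to -1.875
```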

## Better measures of divergence?

There are other measures of divergence that can determine the asymptotic behavior of the likelihood ratio as in $(1)$ (e.g. the discrete distance). However, in this note, I give conditions under which the K.-L. divergence is, up to topological equivalence, the “best” measure of divergence.

# Sudden jumps and almost surely slow convergence

Estimating a mean can be an arduous task. Observe, for instance, the “sudden jumps” in trajectories of the Monte Carlo estimate $\overline X_n$ of $\mathbb{E}X$ when $\mathbb{E}X^2 = \infty$.

Proposition.
Let $X_1, X_2, X_3, \dots$ be a sequence of i.i.d. and integrable random variables, and let $\overline X_k = \frac{1}{k}\sum_{i=1}^k X_i$. For all $M > 0$ we have that

$P\left( \left| \overline X_{n} - \overline X_{n-1}\right| > Mn^{-1/2} \;\; \text{ i.o.} \right)$

is $1$ or $0$ according as $\text{Var}(X_1) = \infty$ or $\text{Var}(X_1) < \infty$.

Corollary.
If $\text{Var}(X_1) = \infty$, then $\overline X_n$ converges to $\mathbb{E}X_1$ at a rate almost surely slower than $\frac{1}{\sqrt{n}}$:

$\left|\overline X_n - \mathbb{E}X_1 \right| \not = \mathcal{O}\left(n^{-1/2}\right) \quad \text{a.s.}$

PDF proof.
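Here is a small simulation of my own illustrating the dichotomy (the Pareto and exponential choices are mine, not from the post): it counts the jumps $\left|\overline X_n - \overline X_{n-1}\right| > M n^{-1/2}$ along a single trajectory in each case.

```python
import numpy as np

rng = np.random.default_rng(1)

def jump_count(x, M=1.0):
    """Count the n with |mean(x[:n]) - mean(x[:n-1])| > M * n**(-1/2)."""
    n = np.arange(1, len(x) + 1)
    running_means = np.cumsum(x) / n
    return int((np.abs(np.diff(running_means)) > M / np.sqrt(n[1:])).sum())

N = 10**6
heavy = rng.pareto(1.5, size=N) + 1  # Pareto(1.5): finite mean, Var = inf
light = rng.exponential(size=N)      # Exponential(1): finite variance

print("infinite variance:", jump_count(heavy))  # keeps growing with N
print("finite variance:  ", jump_count(light))  # a.s. only finitely many
```

A single finite run can of course only suggest the “infinitely often” statement, but the contrast between the two counts is already stark.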

Many thanks to Xi’an who wondered about the occurrence of these jumps.

# Fair unfavorable games

Fair gambling games should be… fair, right?!

Félix Locas, here at UQAM, introduced me to “fair but unfavorable” gambling games of the following type:

• you pay $\mu$ to draw a random positive gain $X$ such that $\mathbb{E}X = \mu$;
• you can play for as long as you want.

Precisely, we have a sequence $X_1, X_2, X_3, \dots$ of independent and identically distributed random variables, where $X_k$ represents your net gain (the draw minus the fee $\mu$) at the $k$th game. Thus $X_k \in [-\mu, \infty)$ and $\mathbb{E}X_k = 0$.

The surprising fact is that there exist such “fair” gambling games for which the total gain $S_n = \sum_{i=1}^n X_i$ tends to $-\infty$ in probability. More precisely (and even worse), there are games for which

$P\left( S_n < -\frac{(1-\varepsilon)n}{(\log n)^{a} }\right) \rightarrow 1$

for every $\varepsilon \in (0,1)$ and all small $a > 0$. (See W. Feller, Note on the law of large numbers and “fair” games, 1945.)
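For concreteness, here is a simulation sketch of what I believe is the classical example behind this phenomenon (my reconstruction; the post does not spell out the game). This particular game drifts like $-n/\log_2 n$, i.e. the case $a = 1$ up to a constant; the stronger small-$a$ rates quoted above come from heavier-tailed variants treated in Feller’s note.

```python
import numpy as np

rng = np.random.default_rng(2)

# My reconstruction of the classical example (see Feller's note for the
# exact setup): the draw pays 2**k with probability 1/(2**k * k * (k+1)),
# k = 1, 2, ..., and 0 otherwise; the fee is mu = E[draw] (about 1), so
# the net gain X = draw - mu satisfies E[X] = 0 exactly.
K = 60                                  # truncate the fast-decaying tail
k = np.arange(1, K + 1)
p_win = 1.0 / (2.0**k * k * (k + 1))
mu = p_win @ 2.0**k                     # = sum 1/(k(k+1)), about 1

values = np.concatenate(([0.0], 2.0**k)) - mu
probs = np.concatenate(([1.0 - p_win.sum()], p_win))

n = 10**6
S = np.cumsum(rng.choice(values, size=n, p=probs))

# Feller: P(S_m < -(1 - eps) * m / log2(m)) -> 1 for this game.
for m in (10**3, 10**4, 10**5, 10**6):
    print(m, S[m - 1], -m / np.log2(m))  # simulated S_m vs. -m/log2(m)
```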