
If one has a random variable $X$, described by a finite probability distribution with equally likely possible values $x_1, \ldots, x_n$, then the inequality: $E[\log X] \leq \log(E[X])$ is a reformulation of the arithmetic/geometric mean inequality.
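Concretely (spelling out the reformulation), with $P(X = x_i) = \tfrac{1}{n}$ for each $i$, the inequality reads

$$\frac{1}{n}\sum_{i=1}^{n} \log x_i \;\leq\; \log\!\left(\frac{1}{n}\sum_{i=1}^{n} x_i\right), \qquad\text{i.e., after exponentiating,}\qquad \sqrt[n]{x_1 \cdots x_n} \;\leq\; \frac{x_1 + \cdots + x_n}{n},$$

so the geometric mean is bounded by the arithmetic mean.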

Does anyone have a reasonable reference for the general inequality $E[\log X] \leq \log(E[X])$ for a general random variable? And is there a necessary and sufficient condition for equality (perhaps equality holds only when the distribution is concentrated at a single point)?

dkrashen
  • 141

1 Answer


This is probably not of use to the OP anymore, but in the interest of the community, I am converting my comment into an answer.

Let $X:\Omega \to \mathbb{R}$ be a random variable on any probability space $(\Omega,\mathcal{F},\mu)$ such that $X > 0$ almost surely. Then, we have the inequality $E[\log X] \leqslant \log E[X]$ with equality if and only if $X$ is constant almost surely.

This is a special case of Jensen's inequality, $E[\varphi(X)] \geqslant \varphi(E[X])$, which holds for convex functions $\DeclareMathOperator{\supp}{supp}\varphi:\supp(X) \to \mathbb{R}$, where $\supp(X)$ is the smallest interval containing the essential support of $X$, i.e., the support of the pushforward measure induced on $\mathbb{R}$ by the random variable $X$. The stated inequality follows by taking $\varphi(x) = -\log x$ for $x\in (0,\infty)$.
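Explicitly, applying Jensen's inequality to $\varphi(x) = -\log x$ and multiplying through by $-1$ gives

$$E[-\log X] \;\geqslant\; -\log E[X] \quad\Longleftrightarrow\quad E[\log X] \;\leqslant\; \log E[X].$$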

The function $\varphi(x) = -\log x$ is strictly convex, and so by the equality conditions for Jensen's inequality, equality holds if and only if $X$ is constant almost surely.
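As a quick sanity check (an added example, not part of the original answer): if $X$ takes the values $1$ and $4$ with probability $\tfrac{1}{2}$ each, then

$$E[\log X] = \tfrac{1}{2}(\log 1 + \log 4) = \log 2 \;<\; \log\tfrac{5}{2} = \log E[X],$$

so the inequality is strict for this non-constant $X$, as expected.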

asahay
  • 288