If one has a random variable $X$, described by a finite probability distribution with equally likely positive values $x_1, \ldots, x_n$, then the inequality $E[\log X] \leq \log(E[X])$ is a reformulation of the arithmetic mean/geometric mean inequality.
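Spelled out under those assumptions (each value taken with probability $1/n$, and all $x_i > 0$ so that $\log x_i$ is defined):

$$E[\log X] = \frac{1}{n}\sum_{i=1}^{n}\log x_i = \log\sqrt[n]{x_1 x_2 \cdots x_n}, \qquad \log(E[X]) = \log\frac{x_1 + x_2 + \cdots + x_n}{n},$$

so the inequality says precisely that the geometric mean is at most the arithmetic mean.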
Does anyone have a reasonable reference for this inequality, $E[\log X] \leq \log(E[X])$, for a general (positive) random variable? And for a necessary and sufficient condition for equality? For example, is equality attained only when the distribution is concentrated at a single point?