18

In general, can one say that for a random variable $X$:

$E[\frac{1}{X}] = \frac{1}{E[X]}$ ?

I've worked out a few examples where this works but I'm not sure how widely this is useful...

  • 1
    In general, $1/E[X]$ is the inverse of an arithmetic mean, and $E[1/X]$ is the inverse of a harmonic mean. Arithmetic and harmonic means of the same set are highly unlikely to be equal to each other... – Micah Dec 01 '12 at 06:38
  • 3
    If $X$ is a positive random variable, then this equality holds if and only if $X$ is a constant (that is, $X=c$ almost surely). – Yury Dec 01 '12 at 06:41
  • 4
    I've worked out a few examples where this works... Really? Which ones? – Did Dec 01 '12 at 11:10
  • An example where this does hold is given here. Further discussion can be found here. – StubbornAtom May 10 '20 at 19:34

6 Answers

25

It is very rarely true. Let's do a random example. Let $X$ be uniform on $[1,3]$. Then $E(X)=2$. But $$E\left(\frac{1}{X}\right)=\int_1^3 \frac{1}{x}\cdot\frac{1}{2}\,dx=\frac{\log 3}{2}\ne \frac{1}{2}.$$

For a simpler example, let $X=1$ with probability $1/2$, and let $X=3$ with probability $1/2$. Then $E(X)=2$.

But $E(1/X)=(1/2)(1)+(1/2)(1/3)=2/3$.
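As a quick numerical sanity check of both examples (a sketch using numpy/scipy, not part of the original answer), the two expectations can be computed by direct quadrature and by the two-point formula; in both cases $E(1/X)$ lands well away from $1/E(X)$:

```python
import numpy as np
from scipy.integrate import quad

# X uniform on [1, 3]: the density is 1/2 on that interval.
E_X = quad(lambda x: x * 0.5, 1, 3)[0]        # 2.0
E_inv_X = quad(lambda x: 0.5 / x, 1, 3)[0]    # log(3)/2 ≈ 0.549
print(E_inv_X, 1 / E_X, np.log(3) / 2)

# X = 1 or X = 3, each with probability 1/2.
E_X2 = 0.5 * 1 + 0.5 * 3             # 2
E_inv_X2 = 0.5 * 1 + 0.5 * (1 / 3)   # 2/3
print(E_inv_X2, 1 / E_X2)
```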

Daniel Fischer
  • 211,575
André Nicolas
  • 514,336
  • I am somewhat confused with this answer. Isn't $E(1/X) = \int_{1}^{3} 1/x \cdot f(1/x) d(1/x)$, where $f$ is the density function of the random variable $1/x$, and is this the same as the term given in the answer? – Florian Biermann Jan 10 '19 at 17:41
    @Florian that would be incorrect since it is implicit here that the expectation is over the random variable $X$, not its inverse. The density accordingly has to be that of $X$. – ijuneja Jan 10 '21 at 06:49
18

Jensen's inequality for functions of random variables says that $\mathbf{E}[\varphi(X)] \geq \varphi(\mathbf{E}[X])$ when $\varphi$ is convex, and $\mathbf{E}[\varphi(X)] \leq \varphi(\mathbf{E}[X])$ when $\varphi$ is concave. The function $\varphi(x) = \frac{1}{x}$ is convex when restricted to the positive reals or to the negative reals, so $\mathbf{E}[X^{-1}]\ge \mathbf{E}[X]^{-1}$ holds as long as $X$ is almost surely positive or almost surely negative.
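Here is a rough Monte Carlo illustration of this bound (my own sketch; the lognormal choice of $X$ is an assumption, not part of the answer). For $X = e^Z$ with $Z\sim\mathcal N(0,1)$ one has $E[1/X] = e^{1/2}$ and $1/E[X] = e^{-1/2}$, so the Jensen gap is easy to see numerically:

```python
import numpy as np

# Sketch: X lognormal, so X > 0 almost surely. Then E[1/X] = e^{1/2}
# while 1/E[X] = e^{-1/2}, and the gap should be clearly visible.
rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=1.0, size=1_000_000)

print(np.mean(1 / x))   # ≈ 1.65  (estimates E[1/X] = e^{1/2})
print(1 / np.mean(x))   # ≈ 0.61  (estimates 1/E[X] = e^{-1/2})
```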

Mike Earnest
  • 84,902
Alex
  • 19,395
  • 2
    There is a "for concave functions" missing. – Michael Greinecker Jul 02 '13 at 21:39
  • 1
    The function $x\mapsto 1/x$ is only convex on the domains $(0,+\infty)$ and $(-\infty,0)$. Therefore, the inequality $E[1/X]\ge 1/E[X]$ is only valid if $P(X> 0)=1$ or $P(X<0)=1$. For example, if $P(X=1)=2/3$ and $P(X=-1)=1/3$, then $E[1/X]=1/3$, while $1/E[X]=3$ (a quick numeric check of this example follows below). – Mike Earnest Jan 17 '22 at 18:42
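A minimal numeric check of the two-point example in the comment above (a sketch, not part of the original thread):

```python
# P(X = 1) = 2/3, P(X = -1) = 1/3: X takes both signs, so Jensen no longer applies.
pmf = {1: 2 / 3, -1: 1 / 3}

E_X = sum(x * p for x, p in pmf.items())       # 1/3
E_inv_X = sum(p / x for x, p in pmf.items())   # 1/3

print(E_inv_X, 1 / E_X)   # 0.333... vs 3.0 — the bound E[1/X] >= 1/E[X] fails
```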
7

For such a case, it is a good idea to study Jensen's inequality.

Another counterexample, in addition to the one given by André Nicolas, is the following. Let $X$ be normally distributed with mean $\mu$ and variance one. Then $E[X]=\mu$, but $E\left[\frac{1}{X}\right]$ is not merely different from $1/\mu$ in general; it does not exist at all.
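To see the non-existence concretely, here is a small numerical sketch (taking $\mu=1$ for concreteness; the cutoff values are my own choice). The contribution to $E[1/X]$ from just the interval $(\varepsilon,1]$ already grows without bound as $\varepsilon\to 0$, roughly like $\varphi(1)\log(1/\varepsilon)$, and the part just to the left of the origin diverges the same way with the opposite sign, so the expectation is not defined:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Contribution to E[1/X] from (eps, 1] when X ~ N(1, 1); it grows roughly
# like norm.pdf(0, loc=1) * log(1/eps), i.e. without bound as eps -> 0.
for eps in (1e-1, 1e-2, 1e-4, 1e-6):
    val, _ = quad(lambda x: norm.pdf(x, loc=1.0) / x, eps, 1.0, limit=200)
    print(f"integral over ({eps:.0e}, 1] = {val:.3f}")
```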

Learner
  • 7,538
  • How do I see that it doesn't exist? Could you explain that? Thank you! – kelu Jun 19 '13 at 09:40
  • You'd need to compute $\int f(x)\cdot\frac{1}{x}\,dx$ (as per the definition of expectation), which in this case (Wolfram Alpha notation, ignoring constants and taking $\mu=1$: Integrate[exp(-(x-1)^2)/x,{x,-inf,inf}]) just doesn't converge: the integrand behaves like a constant times $1/x$ near the origin, so the area under the curve is infinite. – alexey Mar 11 '16 at 18:53
0

Take any continuous random variable $X$ whose density $f_X$ has support on an open interval containing the origin. If $f_X$ is continuous at $0$ with $f_X(0)>0$, then $\mathsf EX^{-1}$ does not exist, because near the origin the integrand $f_X(x)/x$ behaves like $f_X(0)/x$, which is not integrable. There are many densities satisfying these properties for which $\mathsf EX$ is nevertheless finite and nonzero, e.g. $X\sim\mathcal N(1,1)$. Hence, $\mathsf EX^{-1}\neq (\mathsf EX)^{-1}$ under these conditions.

-2

In general, if $X>0$, then the following inequality always holds:

$$E\left(\frac{1}{X}\right)\ge \frac{1}{E(X)}.$$
Meisam
  • 11
-4

$Y = \frac{1}{X}$ is a convex function of $X$ when $X > 0$, so Jensen's inequality gives $E[1/X] \ge 1/E[X]$ in that case.