In general, can one say that for a random variable $X$:
$E[\frac{1}{X}] = \frac{1}{E[X]}$?
I've worked out a few examples where this holds, but I'm not sure how generally it is true...
It is very rarely true. Let's do a random example. Let $X$ be uniform on $[1,3]$. Then $E(X)=2$. But $$E\left(\frac{1}{X}\right)=\int_1^3 \frac{1}{x}\cdot\frac{1}{2}\,dx=\frac{\log 3}{2}\ne \frac{1}{2}.$$
For a simpler example, let $X=1$ with probability $1/2$, and let $X=3$ with probability $1/2$. Then $E(X)=2$.
But $E(1/X)=(1/2)(1)+(1/2)(1/3)=2/3$.
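A quick Monte Carlo check of both examples (a sketch using NumPy; the sample size and seed are my own choices, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Example 1: X uniform on [1, 3]
x = rng.uniform(1.0, 3.0, size=n)
print(x.mean())            # ~2.0   -> E[X]
print((1.0 / x).mean())    # ~0.549 -> E[1/X] = log(3)/2
print(np.log(3) / 2)       # exact value, clearly not 1/E[X] = 0.5

# Example 2: X = 1 or 3, each with probability 1/2
x = rng.choice([1.0, 3.0], size=n)
print(x.mean())            # ~2.0   -> E[X]
print((1.0 / x).mean())    # ~0.667 -> E[1/X] = 2/3, not 1/2
```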
Jensen's inequality for functions of RVs says $\mathbf{E}\,\varphi(X) \geq \varphi(\mathbf{E}X)$ when $\varphi$ is convex, and $\mathbf{E}\,\varphi(X) \leq \varphi(\mathbf{E}X)$ when $\varphi$ is concave. Now $\varphi(x) = \frac{1}{x}$ is convex on the positive reals and concave on the negative reals, so ${\bf E}[X^{-1}]\ge {\bf E}[X]^{-1}$ holds whenever $X$ is almost surely positive, and the reverse inequality holds whenever $X$ is almost surely negative.
For cases like this, it is a good idea to study Jensen's inequality.
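As a concrete illustration (my own sketch, not part of the answer above): for a lognormal variable $X=e^Z$ with $Z\sim\mathcal N(\mu,\sigma^2)$, both sides have closed forms, $E[1/X]=e^{-\mu+\sigma^2/2}$ and $1/E[X]=e^{-\mu-\sigma^2/2}$, so the Jensen gap is exactly the factor $e^{\sigma^2}$:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 0.5, 1.0, 10**6

# X lognormal: X = exp(Z), Z ~ N(mu, sigma^2), so X > 0 almost surely
x = rng.lognormal(mean=mu, sigma=sigma, size=n)

e_inv_x = (1.0 / x).mean()   # ~ exp(-mu + sigma^2/2)
inv_e_x = 1.0 / x.mean()     # ~ exp(-mu - sigma^2/2)

print(e_inv_x, np.exp(-mu + sigma**2 / 2))  # E[1/X] vs exact
print(inv_e_x, np.exp(-mu - sigma**2 / 2))  # 1/E[X] vs exact
print(e_inv_x >= inv_e_x)                   # Jensen: True
```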
Another counterexample, besides the one given by André Nicolas: let $X$ be normally distributed with mean $\mu$ and variance one. Then $E[X]=\mu$, but $E[\frac{1}{X}]$ is not merely different from $1/\mu$ in general; it does not exist at all.
Take any continuous random variable $X$ whose density $f_X$ is continuous at the origin and whose support is an open interval containing the origin. If $f_X(0)>0$, then $\mathsf EX^{-1}$ does not exist; yet there are many densities satisfying these properties for which $\mathsf EX$ is finite and nonzero, e.g. $X\sim\mathcal N(1,1)$. Hence $\mathsf EX^{-1}\neq (\mathsf EX)^{-1}$ under these conditions.
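To fill in that step (my own sketch, using the continuity of $f_X$ at $0$): choose $\delta>0$ with $f_X(x)\ge f_X(0)/2$ for all $|x|<\delta$; then $$\mathsf E\,|X|^{-1}\ \ge\ \int_0^{\delta}\frac{f_X(x)}{x}\,dx\ \ge\ \frac{f_X(0)}{2}\int_0^{\delta}\frac{dx}{x}\ =\ \infty,$$ so the defining integral for $\mathsf EX^{-1}$ fails to converge absolutely, and the expectation does not exist.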
In general, if $X>0$, the following inequality always holds: $$E\left(\frac{1}{X}\right)\ \ge\ \frac{1}{E(X)},$$ because $y = 1/x$ is a convex function for $x > 0$, so Jensen's inequality applies. Moreover, since $1/x$ is strictly convex on the positive reals, equality holds only when $X$ is almost surely constant.
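For instance, the two-point example above is consistent with this bound: $$E\left(\frac{1}{X}\right)=\frac{2}{3}\ \ge\ \frac{1}{2}=\frac{1}{E(X)}.$$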