
In the calculation of the mean of a non-negative random variable via its CDF, the integral is evaluated using integration by parts, and the boundary term is claimed to vanish:

$$[x(1-F(x)) ]_{0}^{\infty} =0$$

This question has been answered in full generality using the tools of the Lebesgue–Stieltjes integral (which I do not know). Is there a proof, perhaps one that only works for continuous distributions, that can be done with introductory calculus? Or does this limit require advanced methods?

My approach so far: I evaluated this limit and got an $\infty \times 0$ indeterminate form: $$\lim_{x\to\infty} x(1-F(x))$$

I converted it into a $\frac{0}{0}$ form and applied L'Hôpital's rule, but I again got an $\infty \times 0$ form: $$\lim_{x\to\infty} \frac{1-F(x)}{\frac{1}{x}}=\lim_{x\to\infty} \frac{-f(x)}{\frac{-1}{x^2}}=\lim_{x\to\infty} x^2f(x)$$
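As a concrete sanity check (the exponential distribution here is my own illustrative choice, not from the original question), both the boundary term $x(1-F(x))$ and its post-L'Hôpital form $x^2 f(x)$ can be evaluated numerically and seen to vanish:

```python
import math

# Hypothetical concrete case: exponential distribution with rate 1, so
# f(x) = exp(-x) and F(x) = 1 - exp(-x) for x >= 0.  L'Hopital's rule
# reduces the boundary term to lim x^2 f(x) = lim x^2 exp(-x).

def tail_term(x):
    """x * (1 - F(x)) for the rate-1 exponential distribution."""
    return x * math.exp(-x)

def lhopital_form(x):
    """x^2 * f(x), the form obtained after one round of L'Hopital."""
    return x * x * math.exp(-x)

for x in [10.0, 50.0, 100.0]:
    print(x, tail_term(x), lhopital_form(x))
```

For this distribution both quantities decay to $0$ exponentially fast, so the indeterminate form resolves; the difficulty in the question is doing this for a general $F$.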

Starlight
  • $\int_x^{\infty} yf(y)dy\ge x \int_x^{\infty} f(y)dy=x(1-F(x))$ and $\int_x^{\infty} yf(y)dy \to 0$ if the mean is finite. – Kavi Rama Murthy May 10 '24 at 08:09
  • Why is the first integral greater than the second? And why does the second integral tend to zero? – Starlight May 10 '24 at 08:28
  • Where can I get a systematic discussion of these properties, and their proofs and applications? – Starlight May 10 '24 at 09:00
  • Please see the new part added to the answer. I found a question in CV older than the MSE question that I mentioned in my answer. The first answer of this CV question can help you as it only uses elementary calculus. Don't hesitate to ask if you have any questions. – Amir May 10 '24 at 22:58

1 Answer


For $X$ with finite mean $\mu$ and finite variance $\sigma^2$, which is the case in your problem, Cantelli's inequality gives, for $x>\mu$,

$$ 1-F_X(x)=\mathbb P(X> x) =\mathbb P(X-\mu> x-\mu)\le \frac{\sigma^2}{\sigma^2+(x-\mu)^2}. $$

Hence, for $x > \max(0,\mu)$,

$$ 0\le x(1-F_X(x)) \le \frac{x\sigma^2}{\sigma^2+(x-\mu)^2}, $$

which gives the desired result by the sandwich theorem.
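To illustrate this sandwich numerically, here is a small check for a standard normal distribution ($\mu=0$, $\sigma=1$; my own illustrative choice, not from the post), using the complementary error function for the normal tail:

```python
import math

# Sanity check of the Cantelli-based sandwich for a standard normal
# (mu = 0, sigma = 1):  0 <= x*(1 - F(x)) <= x*sigma^2 / (sigma^2 + (x - mu)^2).

def normal_sf(x):
    """1 - F(x) for the standard normal, via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def cantelli_bound(x, mu=0.0, sigma=1.0):
    """Upper bound x * sigma^2 / (sigma^2 + (x - mu)^2), valid for x > max(0, mu)."""
    return x * sigma**2 / (sigma**2 + (x - mu) ** 2)

for x in [1.0, 5.0, 20.0]:
    print(x, x * normal_sf(x), cantelli_bound(x))
```

The middle quantity stays below the Cantelli bound at every checked point, and both tend to $0$, as the sandwich argument predicts.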

If the mean is not finite, the limit need not be zero. For example, consider the following cdf

$$F_X(x)=1-\frac1x, \, x\ge 1$$

for which $\lim_{x\to\infty}x(1-F_X(x))=1$.

If the mean is finite but the variance is not, the limit can still hold. For example, consider the following cdf

$$F_X(x)=1-\frac{1}{x^{1.5}}, \, x\ge 1$$

for which $\lim_{x\to\infty}x(1-F_X(x))=0$.
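For these two example tails the product $x(1-F_X(x))$ has a closed form, which a few lines of code can confirm: it is identically $1$ in the infinite-mean case and equals $x^{-0.5}\to 0$ in the finite-mean case.

```python
# Closed-form check of the two example tails (both defined for x >= 1):
#   F(x) = 1 - 1/x      =>  x * (1 - F(x)) = 1         (infinite mean)
#   F(x) = 1 - x**-1.5  =>  x * (1 - F(x)) = x**-0.5   (finite mean, infinite variance)

def term_infinite_mean(x):
    return x * (1.0 / x)

def term_finite_mean(x):
    return x * x**-1.5

for x in [10.0, 1e4, 1e8]:
    print(x, term_infinite_mean(x), term_finite_mean(x))
```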

In fact, using a more advanced method (used in the post linked here), it can be proven that

$\lim_{x\to\infty}x(1-F_X(x))=0$ whenever $\mathbb E(X)$ is finite. (The converse fails in general: the tail $1-F_X(x)=\frac{1}{x\log x}$ for $x\ge e$ gives $x(1-F_X(x))\to 0$ while the mean is infinite.)


PS: Consider

$$\mathbb P \left\{ X > x \right\} \sim x^{- \alpha} \quad \text{as } x \to \infty,$$ for $\alpha > 0$.

When $\alpha<2$, the distribution is said to have a fat tail, for which the variance is infinite (a characteristic property of power-law distributions), but it has a finite mean for $\alpha>1$. For $\alpha\le 1$, you can see that $x(1-F_X(x))$ does not tend to $0$: the limit is $1$ for $\alpha=1$ and $\infty$ for $\alpha<1$.


Update

I just found an older question on Cross Validated: limit of $x \left[1-F(x) \right]$ as $x \to \infty$. See the first answer to that question. It is similar to the method suggested by @geetha290krm and only uses elementary calculus.

Note that if you replace $f(x)\,\text{d}x$ with $\text{d}F(x)$ in the integrals appearing in that first answer, you will get the answer given by @MikeEarnest in this post (Is it true that $\lim\limits_{x\to\infty}{x\cdot P[X>x]}=0$?), with no need to know much about the Lebesgue–Stieltjes integral.
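For completeness, here is a sketch (in my own wording, following the linked answers) of how that elementary-calculus argument runs for a continuous non-negative $X$ with density $f$ and finite mean:

$$x(1-F(x)) = x\int_x^{\infty} f(y)\,\text{d}y \le \int_x^{\infty} y f(y)\,\text{d}y,$$

since $y \ge x$ throughout the range of integration. Because $\mathbb E(X)=\int_0^{\infty} y f(y)\,\text{d}y$ is finite, its tail satisfies $\int_x^{\infty} y f(y)\,\text{d}y \to 0$ as $x\to\infty$, and the sandwich theorem then gives $x(1-F(x))\to 0$.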

Amir