
Consider a random variable $X$ with mean zero ($\mu_X = 0$), known variance ($\sigma_X^2$), and all other moments finite but unknown. I am interested in obtaining an estimate of the expected value of the positive part of this random variable, i.e., given $X^{+} \equiv \max(0, X)$ I want $\mathbb{E}(X^{+})$. Preferably this would only be a function of the variance as I have no other information, but this may not be possible.

It is simple to apply a standard Taylor series approach to this problem, e.g. if $f$ is the positive part function:

$$\mathbb{E}\left[f(X)\right]\approx f(\mu _{X})+{\frac {f''(\mu _{X})}{2}}\sigma _{X}^{2}$$

However, since $\mu_X = 0$ sits exactly at the kink of the positive part function, $f''(\mu_X)$ is undefined. It is easy to construct a smooth function that converges to $x^+$ in some limit and has a well-defined $f''(\mu_X)$, but such a smoothing is not unique, so I don't expect the behavior of any particular smoothed function to carry over to $X^+$.
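For instance, one such smoothing (purely illustrative, and by no means unique) is

$$f_\varepsilon(x) = \tfrac{1}{2}\left(x + \sqrt{x^2 + \varepsilon^2}\right),$$

which converges pointwise to $x^+$ as $\varepsilon \to 0$ but has $f_\varepsilon''(0) = 1/(2\varepsilon)$, so the second-order estimate $f_\varepsilon(0) + \tfrac{1}{2} f_\varepsilon''(0)\,\sigma_X^2 = \varepsilon/2 + \sigma_X^2/(4\varepsilon)$ depends entirely on the arbitrary parameter $\varepsilon$.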

It is not difficult to show that $\mathbb{E}(X^{+}) < \sigma_X$, but I'd prefer to know something like $\mathbb{E}(X^{+}) \approx \alpha \, \sigma_X$, where $\alpha$ is a constant to be determined. (Thanks to stud_iisc for noting that the inequality is strict.)

If it is necessary to assume that $X$ is Gaussian to get a result, that may be acceptable, though $X$ may not be Gaussian.
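To see why a single distribution-free value of $\alpha$ may not exist, here is a minimal Monte Carlo sketch (not part of the original question; it assumes NumPy and picks a few arbitrary illustrative distributions, each rescaled to mean zero and a common variance):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
sigma = 2.0  # common standard deviation for every distribution below

# Mean-zero distributions, each scaled so that Var(X) = sigma^2.
samples = {
    "normal": rng.normal(0.0, sigma, n),
    "uniform": rng.uniform(-1.0, 1.0, n) * sigma * np.sqrt(3.0),  # Var(U(-1,1)) = 1/3
    "two-point +/- sigma": rng.choice([-sigma, sigma], n),
    "shifted exponential": rng.exponential(sigma, n) - sigma,     # Exp(scale=sigma) minus its mean
}

for name, x in samples.items():
    alpha = np.mean(np.maximum(x, 0.0)) / sigma  # estimate of E(X+)/sigma
    print(f"{name:22s}  alpha ~ {alpha:.3f}")
```

The estimates come out near $0.399$ (normal, i.e. $1/\sqrt{2\pi}$), $0.433$ (uniform), $0.5$ (symmetric two-point), and $0.368$ (shifted exponential), so $\alpha$ genuinely depends on more than the variance.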

  • Just to add on to https://math.stackexchange.com/questions/1160095/convexity-and-equality-in-jensen-inequality: The inequality is strict since $x^2$ is strictly convex. Thus $0 \leq \alpha < 1$. – rookie Jul 19 '17 at 13:50
  • @stud_iisc: Very interesting that you can show $\mathbb{E}(X^{+}) < \sigma_X$ rather than $\leq$. Thanks. Also, I suspect that $\alpha$ could be larger than $1$ if there are additional terms in the series expansion of $\mathbb{E}(X^{+})$, so the inequality only applies for the expectation, not $\alpha$. Correct me if I'm wrong. –  Jul 19 '17 at 14:00
  • If $X\sim N(0,\sigma^2)$ then $E(X^+)=\int_{0}^{\infty} x\cdot \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-\frac{x^2}{2\sigma^2}}dx = \sqrt{\frac{2}{\pi}} \sigma$. So $\alpha=\sqrt{\frac{2}{\pi}}$. – rookie Jul 19 '17 at 14:06
  • I didn't get your comment. If $E(X^+) < \sigma_X$ then how can $\alpha$ be $\geq 1$? – rookie Jul 19 '17 at 14:10
  • There was a typo. It's actually $\alpha=\sqrt{\frac{1}{2\pi}}$. – rookie Jul 19 '17 at 14:13
  • The computation for a normal random variable was trivial. Not sure why I didn't do that. Thanks. My thinking about $\alpha$ is that we could make a series expansion like $\mathbb{E}(X^+) = \alpha \, \sigma_X + g$ where $g$ is a collection of additional terms. So $\alpha \sigma_X + g < \sigma_X$. Perhaps $g$ is proportional to $\sigma_X$ so ultimately a different coefficient (say $\alpha'$) is less than 1, but I don't know how to show that. –  Jul 19 '17 at 14:15
  • If you post the normal random variable proof as an answer, I'll accept it if there are no other answers after a few days. –  Jul 19 '17 at 14:20

1 Answer


Special case 1: If $X\sim N(0,\sigma^2)$ then $$E(X^+)=\int_{0}^{\infty} x\cdot \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-\frac{x^2}{2\sigma^2}}dx = \sqrt{\frac{1}{2\pi}} \sigma.$$
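One way to evaluate this integral, as noted in the comments, is the substitution $u = x^2/(2\sigma^2)$, so that $x\,dx = \sigma^2\,du$:

$$\int_{0}^{\infty} x\cdot \frac{1}{\sqrt{2 \pi \sigma^2}}\, e^{-\frac{x^2}{2\sigma^2}}\,dx = \frac{\sigma^2}{\sqrt{2\pi\sigma^2}}\int_{0}^{\infty} e^{-u}\,du = \frac{\sigma}{\sqrt{2\pi}}.$$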

Special case 2: If $X$ is a non-negative random variable then $\alpha = 0.$

rookie
  • @satishramanathan What you have evaluated is $E(X|X>0)$. But $E(X^+) = E(X {\bf{1}}_{X>0})$. Just using the substitution $t=x^2$ to solve that integral gives us the required quantity. Correct me if I'm wrong. – rookie Jul 19 '17 at 15:36
  • You are right, it is not a truncated normal distribution; sorry, I voted it up to salvage my misunderstanding. No worries. – Satish Ramanathan Jul 19 '17 at 15:47
  • Just to be clear, what's the precise difference between $E(X|X>0)$ and $E(X^+)$? How does the integral differ for a normal random variable? –  Jul 20 '17 at 01:47
  • @BenTrettel The former is a conditional expectation (https://en.wikipedia.org/wiki/Conditional_expectation) and the latter is the expectation of the rv $X^+$. Let $f$ be the density of $X$. Then the density of $X$, conditional on it being positive, is $f(x)/P(X>0)$ if $x>0$, and $0$ otherwise; thus $E(X|X>0) = \int_{x>0} x\frac{f(x)}{P(X>0)}dx.$ – rookie Jul 20 '17 at 09:32
  • @Ben Trettel, an application of when you would use $E(X|X>0)$ is given in this link. It draws a parallel to yours, hence the confusion. The second special case of stud_iisc will tell you when to use it. https://math.stackexchange.com/questions/1478333/question-about-creating-2-times-2-covariance-matrix-with-call-option/1478850#1478850 – Satish Ramanathan Jul 20 '17 at 11:44
  • Thanks for the comments. This has helped me understand what is relevant to the physical problem I'm studying. If I understand it correctly, $E(X^+)$ differs in that it keeps the weight $P(X \le 0)$ (here one half) on $0$, so for a symmetric distribution it is half the conditional expectation; the numerical check below bears this out. The conditional expectation is probably actually what I want to use, though I'll have to think more about it. I appreciate the help. –  Jul 20 '17 at 12:45
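A minimal numerical sketch of that distinction (added for illustration; it assumes NumPy and a standard normal example, and is not part of the original thread):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.0
x = rng.normal(0.0, sigma, 1_000_000)

e_plus = np.mean(np.maximum(x, 0.0))  # E(X+): negative draws contribute zeros
e_cond = np.mean(x[x > 0])            # E(X | X > 0): only positive draws, renormalized

print(e_plus)  # ~ sigma / sqrt(2*pi) ≈ 0.399
print(e_cond)  # ~ sigma * sqrt(2/pi) ≈ 0.798, i.e. e_plus / P(X > 0)
```

For a distribution symmetric about zero, $P(X>0) = 1/2$, so $E(X^+) = \tfrac12 E(X\mid X>0)$, which is exactly the factor of two visible in the output.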