
It is well known that a median of a distribution $\mu$ can be defined as an $m$ such that $$m\in\operatorname*{arg\,min}_{c\in\mathbb{R}}\mathbb{E}_{X\sim\mu}[|X-c|].$$

Similarly, the mean of a distribution $\mu$ can be characterized as the unique $m$ such that $$m=\operatorname*{arg\,min}_{c\in\mathbb{R}}\mathbb{E}_{X\sim\mu}[(X-c)^2].$$

I am interested in whether there is any reference or literature on the generalization of this to higher powers $p$; in particular, what can be said about $m$ such that $$m\in\operatorname*{arg\,min}_{c\in\mathbb{R}}\mathbb{E}_{X\sim\mu}[|X-c|^p]$$ for $p>2$?

Edit: Coming back to this, I believe the last line can be written as $$m=\operatorname*{arg\,min}_{c\in\mathbb{R}}\mathbb{E}_{X\sim\mu}[|X-c|^p],$$ that is, there is a unique minimizer. This is because for $p\in(1,\infty)$, $|\cdot|^p$ is strictly convex, so $c\mapsto\mathbb{E}_{X\sim\mu}[|X-c|^p]$ is strictly convex (provided the expectation is finite) and therefore has at most one minimizer.
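For concreteness, here is a minimal numerical sketch of the empirical version of this minimizer on a finite sample (my own illustration, not part of the question); it assumes `numpy` and `scipy` are available, and `lp_location` is just a hypothetical helper name:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lp_location(x, p):
    """Minimize c -> mean(|x_i - c|^p); for p >= 1 the minimizer lies in [min(x), max(x)]."""
    objective = lambda c: np.mean(np.abs(x - c) ** p)
    return minimize_scalar(objective, bounds=(x.min(), x.max()), method="bounded").x

rng = np.random.default_rng(0)
x = rng.exponential(size=10_000)            # right-skewed, so median != mean

print(lp_location(x, 1), np.median(x))      # approximately equal
print(lp_location(x, 2), np.mean(x))        # approximately equal
print(lp_location(x, 3))                    # lies to the right of the mean for this skewed sample
```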

Tyler6
  • Note this is not the $p$th moment; it is distinct. In particular, the 1st moment is the mean, while here $p=1$ gives the median and $p=2$ recovers the mean, not the second moment. – Tyler6 Nov 01 '22 at 03:39
  • In a discrete situation (i.e. not a distribution but a finite set of $x_i$), we would have: the $c$ such that $\sum_{x_i < c} (c-x_i)^p = \sum_{x_i > c} (x_i-c)^p$ is $\operatorname*{arg\,min}_{c\in\mathbb{R}} \sum_{x_i} |x_i-c|^{p+1}$. (This also works for the median, with $p=0$, replacing "is" by "$\in$", as you noted.) This can be proven by finding the minimum as the point where the derivative vanishes (hence the exponent $p+1$ becomes $p$). So the result for distributions should be similar. – Jean-Armand Moroni Nov 12 '22 at 18:48
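The balance condition in the last comment is easy to check numerically. A minimal sketch of my own, assuming a finite sample and $p\ge 1$ (names and the choice of distribution are arbitrary):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
x = rng.lognormal(size=500)                  # arbitrary skewed sample
p = 2                                        # so the objective below is sum |x_i - c|^3

objective = lambda c: np.sum(np.abs(x - c) ** (p + 1))
c_star = minimize_scalar(objective, bounds=(x.min(), x.max()), method="bounded").x

left = np.sum((c_star - x[x < c_star]) ** p)
right = np.sum((x[x > c_star] - c_star) ** p)
print(c_star, left, right)                   # the two sums agree at the minimizer
```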

1 Answer


I am interested in whether there is any reference or literature on the generalization of this to higher powers $p$

There's a "short communication" that defines the quantity you call $m$ as the "location parameter $L_p(\mathcal{D})$". It doesn't have all that many references, which seems to suggest that the authors came up with the idea on their own, rather than finding it in prior work:

Callegaro, L., & Pennecchi, F. (2007). Why always seek the expected value? A discussion relating to the Lp norm. Metrologia, 44(6), L68.

what can be said about $m$ for $p>2$?

The same two authors have an earlier paper in which they discuss estimators for $m$. This topic seems more widely studied, and the references within this paper may be worth looking at:

Pennecchi, F., & Callegaro, L. (2006). Between the mean and the median: the Lp estimator. Metrologia, 43(3), 213.

Most of the focus is on $1<p<2$, but there are some results for $p=\infty$ and generic $p$. For instance, the authors show how estimators of $m$ vary as a function of $p$ and the probability distribution. In particular, higher values of $p$ will give smaller estimator values for the uniform distribution. They also mention some asymptotic properties of estimators for $m$ (see Eqn 13 and surrounding text).
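As a rough illustration of this dependence on $p$ (my own sketch, not taken from the paper): on a right-skewed sample, the empirical $L_p$ location moves from the median at $p=1$ through the mean at $p=2$ and, as $p\to\infty$, toward the midrange $(\min_i x_i + \max_i x_i)/2$, which minimizes $\max_i|x_i-c|$.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lp_location(x, p):
    objective = lambda c: np.mean(np.abs(x - c) ** p)
    return minimize_scalar(objective, bounds=(x.min(), x.max()), method="bounded").x

rng = np.random.default_rng(2)
x = rng.exponential(size=2_000)              # right-skewed sample

for p in (1, 1.5, 2, 4, 8, 16):
    print(p, lp_location(x, p))              # tends to increase with p for this sample
print("midrange:", (x.min() + x.max()) / 2)  # the p -> infinity limit
```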

hargow