Suppose $x$ is a random vector in $\mathbb{R}^n$ which is distributed according to $D$.

What is an unbiased estimator of the covariance matrix of an $n$-dimensional random vector?

When $y_1, y_2, \ldots, y_N$ are i.i.d. samples of a random variable $y$, the sample mean $\hat{\mu}=\frac{1}{N}\sum_{i=1}^N y_i$ is an unbiased estimator of the mean, and $\hat{\sigma}^2=\frac{1}{N-1}\sum_{i=1}^N(y_i-\hat{\mu})^2$ is an unbiased estimator of the variance.
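As a quick numerical illustration of the one-dimensional estimators above (a sketch using NumPy; the standard-normal sampling distribution is just for illustration):

```python
import numpy as np

# Draw N i.i.d. samples y_1, ..., y_N (standard normal, purely illustrative)
rng = np.random.default_rng(0)
N = 1000
y = rng.standard_normal(N)

# Sample mean: unbiased estimator of the mean
mu_hat = y.sum() / N

# Unbiased estimator of the variance: divide by N - 1 (Bessel's correction)
sigma2_hat = ((y - mu_hat) ** 2).sum() / (N - 1)

# NumPy's ddof=1 applies the same 1/(N-1) normalization
print(np.isclose(sigma2_hat, np.var(y, ddof=1)))
```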

In higher dimensions, in addition to the variances we also have the covariances between the elements of the random vector. My question is:

$$ \hat{C}=? $$ where $\hat{C}$ is an unbiased estimator of $C = \mathbb{E}[(x-\mu)(x-\mu)^T]$.

  • Essentially what you might expect, with $\hat{\vec{\mu}}= \frac1n \sum \vec{x}_i$ for the estimator of the mean vector and $\frac1{n-1} \sum (\vec{x}_i - \hat{\vec{\mu}})(\vec{x}_i - \hat{\vec{\mu}})^T$ for the estimator of the covariance matrix. See https://math.stackexchange.com/questions/2019122/unbiased-estimate-of-the-covariance – Henry Dec 26 '18 at 18:12
  • @Henry: That answer is for the $(X,Y)$ random vector; I need a proof in vector terms. –  Dec 26 '18 at 19:33

0 Answers