
Proposition:

Consider a linear regression model $Y_n=X_\pi \beta + \varepsilon_n$, where $X_\pi$ is an $n\times p_n$ matrix and the errors

$$\varepsilon _n=\left ( \varepsilon _{n1},...,\varepsilon _{nn} \right )'$$ consist of $n$ independent and identically distributed variables, $\varepsilon _{ni} \sim N(0,1)$, for $i=1,...,n$. Consider $M_\pi =X_\pi\left ( X_\pi'X_\pi \right )^{-1}X_\pi'$. Then, $$\varepsilon _n' M_\pi \varepsilon _n \sim \chi ^2\left ( p_n \right ),$$

where $M_\pi$ is an idempotent and symmetric matrix.

How can I prove that $\varepsilon _n' M_\pi \varepsilon _n$ follows a chi-squared distribution?

grand_chat

1 Answer

Lemma: If $A$ is a symmetric and idempotent $n\times n$ real matrix, then $A=UU^T$ where $U$ is an $n\times r$ matrix with orthonormal columns, $r$ being the rank of $A$.

Proof: Since $A$ is idempotent, its eigenvalues are all zero or one, and the multiplicity of the eigenvalue one equals the rank $r$ of $A$. Apply the spectral theorem for symmetric matrices to write $A=UDU^T$, where $D$ is a diagonal matrix of the eigenvalues of $A$ and $U$ is an $n\times n$ orthogonal matrix whose columns are the corresponding eigenvectors. Delete from $U$ the columns corresponding to the zero eigenvalues, leaving an $n\times r$ matrix with orthonormal columns; the corresponding $D$ then becomes the $r\times r$ identity, so $A=UU^T$.
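The lemma is easy to check numerically. Below is a minimal sketch (the $10\times 3$ design matrix and the random seed are arbitrary choices, not from the question): build the projection matrix onto the column space of some $X$, eigendecompose it, and keep only the eigenvectors belonging to the unit eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric idempotent matrix: the projection onto col(X)
# for an arbitrary full-rank X (hypothetical 10x3 example).
n, p = 10, 3
X = rng.standard_normal((n, p))
A = X @ np.linalg.solve(X.T @ X, X.T)

# Spectral decomposition; the eigenvalues of A are all 0 or 1.
eigvals, eigvecs = np.linalg.eigh(A)

# Keep the eigenvectors of the unit eigenvalues; r = rank(A) = p.
U = eigvecs[:, eigvals > 0.5]

print(U.shape)                          # (10, 3)
print(np.allclose(A, U @ U.T))          # True: A = U U^T
print(np.allclose(U.T @ U, np.eye(p)))  # True: orthonormal columns
```

The threshold `0.5` is safe here because, up to floating-point error, every eigenvalue is essentially 0 or 1.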


To prove the result, let's suppress subscripts. Assume $M$ is idempotent and symmetric. Use the lemma to find an $n\times r$ matrix $U$ with orthonormal columns such that $M=UU^T$ and $r$ is the rank of $M$. Write $N:=U^T\varepsilon$. Then $N$ has a multivariate normal distribution with mean vector $0$ and covariance matrix $$ \operatorname{Var}(N)=E\big[(U^T\varepsilon)(U^T\varepsilon)^T\big]= U^TE(\varepsilon\varepsilon^T)U=U^TI_nU=U^TU=I_{r\times r}, $$ using the orthonormality of the columns of $U$. To finish, observe that $$ \varepsilon^T M\varepsilon = \varepsilon^TUU^T\varepsilon=N^TN $$ is the sum of squares of $r$ IID standard normal variables, and therefore has a chi-squared($r$) distribution.

The lemma asserts that $r$ is the rank of $M$. Since $M$ is idempotent, the rank $r$ equals the trace. In the case $M=X(X^TX)^{-1}X^T$ this equals $\operatorname{tr}(M)=\operatorname{tr}[(X^TX)^{-1}(X^TX)]=\operatorname{tr}(I_{p})=p$.
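The whole argument can be verified by simulation. A sketch (the sample size $n=50$, $p=4$ regressors, and the number of replications are hypothetical choices): draw many standard-normal error vectors, compute the quadratic form $\varepsilon^T M \varepsilon$, and check that its first two moments match those of $\chi^2(p)$, namely mean $p$ and variance $2p$.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical design: n = 50 observations, p = 4 regressors.
n, p = 50, 4
X = rng.standard_normal((n, p))
M = X @ np.linalg.solve(X.T @ X, X.T)  # hat matrix: idempotent, symmetric

# Simulate the quadratic form eps' M eps over many standard-normal draws.
reps = 20000
eps = rng.standard_normal((reps, n))
q = np.einsum('ij,jk,ik->i', eps, M, eps)  # row-wise eps_i' M eps_i

# Compare with the chi-squared(p) distribution.
print(q.mean())  # close to p = 4
print(q.var())   # close to 2p = 8

# Kolmogorov-Smirnov test against the exact chi2(p) CDF.
ks = stats.kstest(q, stats.chi2(df=p).cdf)
print(ks.pvalue)
```

A large KS p-value indicates the simulated quadratic form is consistent with the $\chi^2(4)$ distribution, matching the trace computation above.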

grand_chat