
Suppose I have the following OLS regression:

$$Y_i = \beta_0 + X_i'\delta + e_i$$

where $X_i$ is a random vector that does not contain an intercept and $Y_i$ is a scalar.

What should this give me?

$$Z = \hat{\beta}_0 + \bar{X}_N'\hat{\delta}$$

How I've been thinking about it:

Let $W_i = (1\;\, X_i')'$; then $Z$ is:

$$\left(\frac{1}{N}\sum_{i=1}^N W_i\right)'\left(\frac{1}{N}\sum_{i=1}^N W_iW_i'\right)^{-1}\left(\frac{1}{N}\sum_{i=1}^N W_iY_i\right)$$

but this doesn't seem to lead me to anything.
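As a sanity check, $Z$ can be computed on simulated data (the data-generating process below is made up purely for illustration) and compared with the sample mean of $Y$; because the regression includes an intercept, the residuals sum to zero and the two quantities coincide:

```python
import numpy as np

# Hypothetical simulated data: N observations, 3 regressors (no intercept in X)
rng = np.random.default_rng(0)
N = 500
X = rng.normal(size=(N, 3))
Y = 1.5 + X @ np.array([2.0, -1.0, 0.5]) + rng.normal(size=N)

# OLS with an intercept: stack W_i = (1, X_i')'
W = np.column_stack([np.ones(N), X])
beta_hat = np.linalg.solve(W.T @ W, W.T @ Y)
b0_hat, delta_hat = beta_hat[0], beta_hat[1:]

# Z = beta0_hat + Xbar' delta_hat
Z = b0_hat + X.mean(axis=0) @ delta_hat

print(np.isclose(Z, Y.mean()))  # True: Z equals the sample mean of Y
```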

Rainroad

1 Answer


Let $X$ be the design matrix whose $i$-th row is $(1\;\, X_i')$, and let $e=[e_1\, e_2\, \cdots \,e_n]^T$, $\beta=[\beta_0\;\, \delta^T]^T$, and $Z=[Y_1\, Y_2\, \cdots \,Y_n]^T$, so that $$Y_i = \beta_0 + X_i'\delta + e_i\tag{1}.$$

You can write $(1)$ as $$Z= X\beta+e\tag{2},$$ or as $$Z-X\beta=e\tag{3}.$$

Now you can find $\beta$ such that the norm of the residual vector $e$ is minimal. This is equivalent to solving the minimization problem $$\min_{\beta}\|Z-X\beta\|^2=\min_{\beta}g(\beta),\qquad g(\beta)=\|Z-X\beta\|^2.$$

Setting the gradient to zero gives the first-order condition $$\nabla g(\hat{\beta})=0,$$ i.e. the normal equations $$X^T(Z-X\hat{\beta})=0.$$

Therefore $$\hat{\beta}=(X^TX)^{-1}X^TZ$$ when $X^TX$ is invertible; otherwise, replace $(X^TX)^{-1}$ with the Moore-Penrose pseudoinverse $(X^TX)^{+}$.

Now you can write $$\hat{Z}=X\hat{\beta}$$ for the fitted values and $\hat{e}=Z-X\hat{\beta}$ for the residuals.
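The derivation above can be checked numerically. The sketch below (with a simulated design matrix chosen only for illustration) solves the normal equations, compares the result with the pseudoinverse route, and verifies that the residuals $\hat{e}$ are orthogonal to the columns of $X$:

```python
import numpy as np

# Hypothetical small example: design matrix X with an intercept column
rng = np.random.default_rng(1)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
Z = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)  # response vector, as in (2)

# Normal equations X'(Z - X beta) = 0  =>  beta_hat = (X'X)^{-1} X'Z
beta_hat = np.linalg.solve(X.T @ X, X.T @ Z)

# Equivalent route, robust to rank deficiency: Moore-Penrose pseudoinverse
beta_pinv = np.linalg.pinv(X) @ Z

Z_hat = X @ beta_hat   # fitted values
e_hat = Z - Z_hat      # residuals

print(np.allclose(beta_hat, beta_pinv))  # True
print(np.allclose(X.T @ e_hat, 0))       # True: residuals orthogonal to X
```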

Please see "Does centering the dependent variable and every independent variable change the estimated regressor?"

You can find related results by searching for "$Y = X\beta+e$ least squares" on SearchOnMath, e.g. the Wikipedia page on Ordinary least squares.