Suppose we have the model $Y = X\beta$ and the first column of $X$ consists of $1$s. Why does the sum of the residuals in this regression model equal $0$? And why, in general, does it not when there is no constant term?
Does this answer your question? [Why the sum of residuals equals 0 when we do a sample regression by OLS?] – grand_chat Mar 08 '23 at 20:17
1 Answer
Consider a linear model of the form $Y=\beta_0 + \sum_{j=1}^p\beta_jx_j + \epsilon$. To find the OLS estimator you solve $$ \min_{\beta \in \mathcal B}\sum_{i=1}^n\Big(Y_i-\beta_0 - \sum_{j=1}^p\beta_jx_{ij}\Big)^2. $$ Setting the derivative w.r.t. the intercept term $\beta_0$ to zero gives $$ -2\sum_{i=1}^n\Big(Y_i-\hat{\beta}_0 - \sum_{j=1}^p\hat{\beta}_jx_{ij}\Big) = -2\sum_{i=1}^ne_i=0, $$ hence $\sum_{i=1}^n e_i = 0$. Or, using matrix notation, $\hat\beta$ solves the normal equations $$ X'X\hat{\beta}-X'y=0, $$ namely, $$ X'(X\hat\beta - y) = -X'e=0, $$ where the first component of $X'e$ is $\mathbf{1}^Te =\sum_{i=1}^ne_i$, since the first row of $X'$ is the column of ones. Without a constant term, no row of $X'$ equals $\mathbf{1}^T$, so the normal equations no longer force $\mathbf{1}^Te = 0$, and the residuals need not sum to zero.
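A quick numerical check of both claims (the data here are hypothetical, generated just for illustration, and the fit uses NumPy's least-squares solver):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 3.0 + 2.0 * x + rng.normal(size=n)  # hypothetical data with a true intercept

# With an intercept: the design matrix contains a column of ones,
# so the normal equations force the residuals to sum to zero.
X1 = np.column_stack([np.ones(n), x])
beta1, *_ = np.linalg.lstsq(X1, y, rcond=None)
e1 = y - X1 @ beta1
print(e1.sum())  # ~0 up to floating-point error

# Without an intercept: no column of ones in X, so X'e = 0 no longer
# implies that the residuals sum to zero.
X0 = x.reshape(-1, 1)
beta0, *_ = np.linalg.lstsq(X0, y, rcond=None)
e0 = y - X0 @ beta0
print(e0.sum())  # generally nonzero
```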
V. Vancak