I run a multiple regression with three independent variables $X_1$, $X_2$ and $X_3$. The correlations between $Y$ and $X_1$, $Y$ and $X_2$, and $Y$ and $X_3$ are each large and statistically significant. Yet when I fit the multiple regression, none of the t-tests on the individual coefficients is significant. How could this happen?
- More likely the 'opposite': $X_1, X_2,$ and $X_3$ contain essentially the same information (roughly co-linearity) and none stands out above the others in the multiple regression. I suppose the overall model in the latter case is significant. Insufficient information to answer for sure. – BruceET Sep 12 '15 at 04:55
- This surely suggests that a multicollinearity problem exists. – Seow Fan Chong Sep 12 '15 at 05:00
- Can you post the full regression output? – MerylStreep Sep 13 '15 at 19:12
- @MerylStreep This is a question from a qualifying exam, and the above is the whole question. – Mike Brown Sep 13 '15 at 19:20
- Then I agree with @Seow Fan Chong. This looks like a question on imperfect multicollinearity -- what you are describing is possible if the variables are highly intercorrelated. – MerylStreep Sep 13 '15 at 19:23
- The title of your question is a bit misleading -- the test for the overall significance of the regression is the F-test, but the problem itself says nothing about the value of the F-statistic. – MerylStreep Sep 13 '15 at 19:24
- I also agree with @SeowFanChong and @MerylStreep... collinearity inflates the standard errors of your coefficients, so none seems significantly different from zero. – Sep 14 '15 at 03:55
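The scenario the commenters describe is easy to reproduce. Here is a minimal simulation sketch (all variable names and parameter values are illustrative, not from the post): three predictors are near-copies of one latent signal, so each is strongly correlated with $Y$ on its own, yet the multiple-regression t-statistics collapse because the collinearity inflates the standard errors.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

z = rng.normal(size=n)                            # shared latent signal
X = z[:, None] + 0.05 * rng.normal(size=(n, 3))   # X1, X2, X3: near-duplicates of z
y = z + 0.5 * rng.normal(size=n)

# Pairwise t-statistic for corr(Y, X_j): t = r * sqrt(n-2) / sqrt(1 - r^2)
pair_t = []
for j in range(3):
    r = np.corrcoef(y, X[:, j])[0, 1]
    pair_t.append(r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2))

# Multiple regression via the OLS normal equations, with an intercept
D = np.column_stack([np.ones(n), X])
XtX_inv = np.linalg.inv(D.T @ D)
beta = XtX_inv @ D.T @ y
resid = y - D @ beta
sigma2 = resid @ resid / (n - D.shape[1])         # residual variance estimate
se = np.sqrt(sigma2 * np.diag(XtX_inv))           # standard errors of coefficients
multi_t = beta[1:] / se[1:]                       # t-stats for X1..X3 only

print("pairwise t-statistics:", np.round(pair_t, 1))
print("multiple-regression t-statistics:", np.round(multi_t, 2))
```

Each pairwise t-statistic is far beyond any conventional critical value, while the individual t-statistics in the multiple regression are dramatically smaller, typically not significant at all: the three predictors carry almost the same information, so no single coefficient can be pinned down.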