In multiple regression, what is a likely consequence of multicollinearity?


Multiple Choice

In multiple regression, what is a likely consequence of multicollinearity?

Explanation:
When predictors in a multiple regression are highly correlated, the model cannot cleanly separate the unique contribution of each one, so the standard errors of the coefficient estimates inflate. As a result, the estimates become less reliable: confidence intervals widen, t-tests lose power even when the predictors truly influence the outcome, and coefficients can swing across samples or flip signs with small changes in the data. Multicollinearity does not change the dependent variable, reduce the sample size, or eliminate correlations among predictors; its likely consequence is inflated standard errors, which undermine the precision and stability of the coefficient estimates. A common diagnostic is the variance inflation factor (VIF): for a given predictor, VIF = 1/(1 − R²), where R² comes from regressing that predictor on the remaining predictors, and values around 10 or more are often taken to signal serious multicollinearity.
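The effect is easy to see in a small simulation. The sketch below (simulated data with hypothetical true coefficients; variable names are illustrative) fits the same model twice with ordinary least squares: once with a second predictor that nearly duplicates the first, and once with an unrelated second predictor. The standard error on the first coefficient inflates sharply in the collinear design, and the VIF quantifies why.

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares: return coefficients and their standard errors."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    n, k = X.shape
    sigma2 = resid @ resid / (n - k)          # residual variance estimate
    se = np.sqrt(sigma2 * np.diag(XtX_inv))   # coefficient standard errors
    return beta, se

rng = np.random.default_rng(42)
n = 500
x1 = rng.normal(size=n)
x2_col = x1 + rng.normal(scale=0.05, size=n)  # nearly duplicates x1 (collinear)
x2_ind = rng.normal(size=n)                   # unrelated to x1 (benchmark)
noise = rng.normal(size=n)
true_beta = np.array([1.0, 2.0, 0.5])         # hypothetical true effects

X_col = np.column_stack([np.ones(n), x1, x2_col])
X_ind = np.column_stack([np.ones(n), x1, x2_ind])

_, se_col = ols(X_col, X_col @ true_beta + noise)
_, se_ind = ols(X_ind, X_ind @ true_beta + noise)

# Variance inflation factor for x2_col: 1 / (1 - R^2) from regressing it on x1
r2 = np.corrcoef(x1, x2_col)[0, 1] ** 2
vif = 1.0 / (1.0 - r2)

print(f"SE of x1 coefficient, collinear design:   {se_col[1]:.3f}")
print(f"SE of x1 coefficient, independent design: {se_ind[1]:.3f}")
print(f"VIF for x2 in the collinear design:       {vif:.1f}")
```

Note that the inflation comes entirely from the design matrix, not from the noise: both fits use the same error term, yet the collinear design yields standard errors many times larger on the shared predictor.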

