Week 6 - Multiple Linear Regression

Content for week of Monday, October 4, 2021–Friday, October 8, 2021

Overview

Let’s model!

Now we can build powerful models with heaps of independent variables. Want to predict wages? Let’s control for education, for experience, for gender, for age, for age squared (yes!). Only our degrees of freedom can hold us back.

Reading Guide

Chapter 6: Linear Regression with Multiple Regressors

SW 6.1 Omitted Variable Bias

A discussion that connects nicely with our earlier coverage of the zero conditional mean assumption and causal inference.
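To see omitted variable bias in action, here is a small simulation sketch (all numbers made up) in which wages truly depend on education and an unobserved "ability," ability is correlated with education, and we run the short regression of wages on education alone:

```python
# Hypothetical simulation of omitted variable bias.
# True model: wage = 5 + 2*educ + 3*ability + u, but we omit ability.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
ability = rng.normal(size=n)
educ = 12 + 0.5 * ability + rng.normal(size=n)   # educ is correlated with ability
wage = 5 + 2 * educ + 3 * ability + rng.normal(size=n)

# Short regression slope of wage on educ alone: cov(educ, wage) / var(educ)
b_short = np.cov(educ, wage)[0, 1] / np.var(educ, ddof=1)

# OVB formula: b_short ≈ beta_educ + beta_ability * delta,
# where delta is the slope from regressing ability's partner (educ) on ability
delta = np.cov(educ, ability)[0, 1] / np.var(educ, ddof=1)
print(b_short)        # well above the true coefficient of 2: the bias is positive
print(2 + 3 * delta)  # matches the omitted-variable-bias formula
```

Because ability raises wages and is positively correlated with schooling, the short regression overstates the return to education, exactly as the OVB formula predicts.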

SW 6.2 The Multiple Regression Model

Hooray!

SW 6.3 The OLS Estimator in Multiple Regression

This section doesn’t get into derivation, and neither do we!

SW 6.4 Measures of Fit in Multiple Regression

The only new thing here is a revised SER formula and the introduction of the adjusted R². Note that the lecture video also discusses the root mean squared error (RMSE), which is a lot like the SER except that it uses n rather than the degrees of freedom as its denominator.
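The SER/RMSE distinction is just a choice of denominator. A quick sketch with made-up residuals (n and k chosen for illustration):

```python
# SER vs. RMSE for a regression with k regressors.
# SER divides the sum of squared residuals by degrees of freedom (n - k - 1);
# RMSE divides by n, so the two converge as n grows.
import numpy as np

residuals = np.array([1.0, -2.0, 0.5, 1.5, -1.0])  # made-up residuals
n, k = len(residuals), 2                           # n observations, k regressors
ssr = np.sum(residuals ** 2)

ser = np.sqrt(ssr / (n - k - 1))
rmse = np.sqrt(ssr / n)
print(ser, rmse)  # SER exceeds RMSE in small samples
```

In large samples the degrees-of-freedom correction barely matters; in small ones the SER is noticeably larger.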

SW 6.5 The Least Squares Assumptions in Multiple Regression

Take the three from univariate regression and add one more: no perfect multicollinearity. Sorted.

SW 6.6 Distribution of the OLS Estimators in Multiple Regression

Just the intuition, don’t worry about the appendix.

SW 6.7 Multicollinearity

Make sure you understand the examples, but remember that in practice, any statistical package will fix perfect multicollinearity on its own. Imperfect multicollinearity, on the other hand, is something to think about when crafting your models.
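To see mechanically why perfect multicollinearity is fatal, here is a sketch using raw numpy (rather than a stats package) with an invented regressor that is an exact linear combination of the others:

```python
# Perfect multicollinearity makes X'X singular, so the OLS normal equations
# have no unique solution. Hypothetical data for illustration.
import numpy as np

n = 50
x1 = np.linspace(0, 10, n)
x2 = 3 * x1 + 7               # exact linear combination of x1 and the constant
X = np.column_stack([np.ones(n), x1, x2])

print(np.linalg.matrix_rank(X))   # 2, not 3: X is rank-deficient
# Inverting X.T @ X would fail here; statistical packages instead detect
# the problem and drop one of the collinear columns automatically.
```

This is the sense in which the software "fixes" perfect multicollinearity: it silently drops a redundant regressor, which is why you should still read the output carefully.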

SW 6.8 Conclusion

Treat yourself.

In-class exercises

Download the PDF here, which contains the regression output for questions (2)-(4).

Exercises

Consider a dataset on earnings in the United States. We are interested in the returns to education: how much an extra year of schooling “buys” you in terms of weekly wages (...as of 1980). You’re also worried about whether the estimated return to education suffers from omitted variable bias.

  1. You estimate two equations:

    wage^ = 146.95 + 60.21 educ

    educ^ = 5.84 + 0.075 IQ

    Based on these results, is 60.21 an overestimate or underestimate of the returns to education? How do you know?

  2. You estimate another equation:

    wage^ = 128.89 + 42.06 educ + 5.14 IQ

    What is the interpretation of the coefficient on educ? What is the interpretation of the constant?

  3. Now, you control for experience and age and estimate the following population regression model:

    wage_i = β0 + β1 educ_i + β2 IQ_i + β3 exper_i + β4 age_i + β5 age_i² + u_i

    A one-year increase in age is associated with what change in wages? (mind the squared term)

  4. Finally, because you are worried about omitted variable bias, you include father’s and mother’s education.

    1. Why might parents’ education directly affect wages?

    2. Which other independent variables do you think parents’ education might affect? Explain.

    3. How did controlling for parents’ education affect the returns to education? The returns to IQ?
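For the quadratic specification in question (3), remember that the effect of one more year of age is the derivative with respect to age, β4 + 2·β5·age, so it depends on the current age. A quick sketch with invented coefficients:

```python
# Marginal effect of age in a quadratic specification:
# wage = b0 + ... + b4*age + b5*age^2 + u  =>  d(wage)/d(age) = b4 + 2*b5*age.
# The coefficient values below are made up for illustration.
b4, b5 = 5.0, -0.05

def age_effect(age):
    """Change in wage from a one-year increase in age, at a given age."""
    return b4 + 2 * b5 * age

print(round(age_effect(30), 2))   # 2.0: still rising at age 30
print(round(age_effect(60), 2))   # -1.0: the profile eventually turns down
```

The negative coefficient on the squared term is what lets the wage-age profile rise early in a career and flatten or decline later.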

Other resources