Thursday, January 2, 2014

Zero-One Matrices

When we're learning the basics of least squares regression analysis, one of the topics we invariably encounter is the consequences of model mis-specification. In particular, we're taught that omitting relevant regressors from the model renders the OLS estimator biased and inconsistent, although its precision is improved. On the other hand, including extraneous regressors simply reduces the efficiency of the OLS estimator of the coefficient vector. That estimator is still unbiased (and consistent) in this case.
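As a quick illustration of both claims, here's a minimal Monte Carlo sketch in Python (my own addition, with made-up parameter values; the regressors x1, x2, and z are hypothetical). Omitting a relevant regressor that is correlated with x1 shifts the average estimate away from the true value, while including an irrelevant regressor leaves it centred on the truth:

```python
import numpy as np

# A minimal Monte Carlo sketch (made-up parameter values).
rng = np.random.default_rng(42)
n, reps = 100, 5000
beta1, beta2 = 1.0, 0.5               # true coefficients on x1 and x2

b_omit, b_extra = [], []
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.6 * x1 + rng.normal(size=n)   # relevant regressor, correlated with x1
    z = rng.normal(size=n)               # extraneous regressor
    y = beta1 * x1 + beta2 * x2 + rng.normal(size=n)

    # Omit the relevant regressor x2: regress y on an intercept and x1 only.
    X_short = np.column_stack([np.ones(n), x1])
    b_omit.append(np.linalg.lstsq(X_short, y, rcond=None)[0][1])

    # Include the extraneous regressor z alongside x1 and x2.
    X_long = np.column_stack([np.ones(n), x1, x2, z])
    b_extra.append(np.linalg.lstsq(X_long, y, rcond=None)[0][1])

# The first mean drifts towards beta1 + 0.6*beta2 = 1.3 (biased);
# the second stays centred on beta1 = 1.0 (unbiased, just a little
# noisier than it would be without z in the model).
print("mean estimate of beta1, omitting x2: ", np.mean(b_omit))
print("mean estimate of beta1, including z: ", np.mean(b_extra))
```

Comparing np.std(b_omit) and np.std(b_extra) in the same script also shows the precision trade-off mentioned above.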

These results are just special cases of those associated with imposing false restrictions on the parameter space, or failing to impose valid restrictions. So, once these more general results have been covered, there's really no need to treat the "omitted regressors" and "extraneous regressors" situations as a separate matter.
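To make that connection concrete, here is the standard restricted least squares result, in the usual textbook notation (a sketch I'm adding here, not part of the original argument). For the model $y = X\beta + \varepsilon$, with $E(\varepsilon) = 0$, the estimator of $\beta$ that imposes the linear restrictions $R\beta = r$ is

$$ b^* = b - (X'X)^{-1}R'\left[R(X'X)^{-1}R'\right]^{-1}(Rb - r), $$

where $b = (X'X)^{-1}X'y$ is the unrestricted OLS estimator. Since $E(b) = \beta$,

$$ E(b^*) = \beta - (X'X)^{-1}R'\left[R(X'X)^{-1}R'\right]^{-1}(R\beta - r), $$

so $b^*$ is unbiased precisely when the restrictions are valid, i.e., when $R\beta = r$. Omitting regressors amounts to imposing zero restrictions that may be false; retaining extraneous regressors amounts to failing to impose zero restrictions that happen to be true.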

However, they are usually dealt with as a distinct topic. What I find interesting, and what I want to focus on here, is the way in which the unbiasedness of OLS can be demonstrated in the context of irrelevant regressors. There's an easy way to get this result, and there's a more tedious proof. Let's begin by looking at the easy way.
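Before doing so, it may help to fix notation (this framing is my own addition, using the standard partitioned-regression setup). Suppose the data are generated by

$$ y = X_1\beta_1 + \varepsilon, \qquad E(\varepsilon) = 0, $$

but we estimate the over-specified model

$$ y = X_1\beta_1 + X_2\beta_2 + u, $$

where the columns of $X_2$ are the irrelevant regressors. The true coefficient vector for the estimated model is $(\beta_1' \, , \, 0')'$, and the question is whether OLS applied to the larger model is unbiased for that vector.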