- Published in print:
- 2011
- Published Online:
- June 2013
- ISBN:
- 9780804772624
- eISBN:
- 9780804777209
- Item type:
- chapter

- Publisher:
- Stanford University Press
- DOI:
- 10.11126/stanford/9780804772624.003.0005
- Subject:
- Economics and Finance, Econometrics

This chapter discusses the reasons why we can generalize from what we observe in one sample to what we might expect in others. It defines the population, distinguishes between parameters and estimators, and discusses why we work with samples when populations are what we are interested in. It demonstrates that, with the appropriate assumptions about the structure of the population, *a* and *b*, as calculated in Chapter 4, are best linear unbiased (BLU) estimators of the corresponding parameters in the population relationship. Under these population assumptions, regression is frequently known as ordinary least squares (OLS) regression.
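The unbiasedness claim in this abstract can be illustrated with a small simulation that is not from the book itself: under an assumed population relationship y = α + βx + ε with a mean-zero disturbance, the sample intercept a and slope b computed by least squares average out to the population parameters across repeated samples. All parameter values below are hypothetical.

```python
import numpy as np

# Hypothetical sketch: draw many samples from an assumed population
# y = alpha + beta*x + eps and check that the OLS estimators a and b
# are unbiased for alpha and beta on average.
rng = np.random.default_rng(0)
alpha, beta, n, reps = 2.0, 0.5, 100, 2000

a_draws, b_draws = [], []
for _ in range(reps):
    x = rng.normal(10, 3, n)
    eps = rng.normal(0, 1, n)          # mean-zero disturbance, as assumed
    y = alpha + beta * x + eps
    b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)  # OLS slope
    a = y.mean() - b * x.mean()                          # OLS intercept
    a_draws.append(a)
    b_draws.append(b)

print(np.mean(b_draws))  # close to beta = 0.5
print(np.mean(a_draws))  # close to alpha = 2.0
```

Averaging over many samples is what "unbiased" means here: any single sample's a and b will miss, but there is no systematic tendency to miss in one direction.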

- Published in print:
- 2011
- Published Online:
- June 2013
- ISBN:
- 9780804772624
- eISBN:
- 9780804777209
- Item type:
- chapter

- Publisher:
- Stanford University Press
- DOI:
- 10.11126/stanford/9780804772624.003.0012
- Subject:
- Economics and Finance, Econometrics

This chapter begins by deriving the variances of the slopes from a given equation. These are used first to address the issue of multicollinearity. It then turns to the issue of interpreting regression results. The slopes obtained when the sum of squared errors for the regression of the same equation is minimized are best linear unbiased (BLU) estimates of the population coefficients. If the population relationship includes two explanatory variables, the precision of these slopes depends heavily on the extent to which the two explanatory variables are related. Including an irrelevant variable is inefficient, but does not create bias. Everything that was done in Chapters 8 through 10 holds with two explanatory variables, either exactly or with minor, sensible extensions.
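The precision point can be sketched numerically (this simulation and its parameter values are illustrative, not the chapter's own example): with two explanatory variables, the slopes remain unbiased whatever their correlation, but their sampling variance grows as the regressors become more closely related.

```python
import numpy as np

# Hypothetical sketch: the empirical spread of the first OLS slope
# when the two explanatory variables have correlation rho.
rng = np.random.default_rng(1)

def slope_sd(rho, n=100, reps=1000):
    """Std. dev. of the first slope across repeated samples."""
    cov = [[1.0, rho], [rho, 1.0]]
    b1 = []
    for _ in range(reps):
        X = rng.multivariate_normal([0.0, 0.0], cov, n)
        y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1] + rng.normal(0, 1, n)
        Xd = np.column_stack([np.ones(n), X])   # constant plus both regressors
        coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        b1.append(coef[1])
    return np.std(b1)

print(slope_sd(0.0) < slope_sd(0.95))  # True: collinearity inflates variance
```

The textbook variance formula implies the slope's standard error scales with 1/sqrt(1 - r²), so at r = 0.95 the slope is roughly three times less precise than at r = 0.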

- Published in print:
- 2011
- Published Online:
- June 2013
- ISBN:
- 9780804772624
- eISBN:
- 9780804777209
- Item type:
- chapter

- Publisher:
- Stanford University Press
- DOI:
- 10.11126/stanford/9780804772624.003.0011
- Subject:
- Economics and Finance, Econometrics

This chapter shows that if the population relationship includes two explanatory variables, but the sample regression contains only one, then the estimate of the effect of the included variable is almost surely biased. The best remedy is to include the omitted variable in the sample regression. Minimizing the sum of squared errors from a regression with two explanatory variables yields two slopes, each of which represents the relationship between the parts of the dependent variable and the associated explanatory variable that are not related to the other explanatory variable. These slopes are unbiased estimators of the population coefficients.
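Omitted-variable bias is easy to exhibit in a simulation (the numbers below are assumptions for illustration, not drawn from the book): when the omitted variable is correlated with the included one, the short regression's slope absorbs part of the omitted coefficient, while the long regression that includes both variables centers on the true value.

```python
import numpy as np

# Hypothetical sketch: population has two explanatory variables with
# correlation rho; the "short" regression omits x2, the "long" one
# includes it.
rng = np.random.default_rng(2)
n, reps = 200, 1000
beta1, beta2, rho = 2.0, 3.0, 0.6

short, long_ = [], []
for _ in range(reps):
    X = rng.multivariate_normal([0.0, 0.0], [[1, rho], [rho, 1]], n)
    y = beta1 * X[:, 0] + beta2 * X[:, 1] + rng.normal(0, 1, n)
    # short regression: slope of y on x1 alone
    short.append(np.cov(X[:, 0], y, ddof=1)[0, 1] / np.var(X[:, 0], ddof=1))
    # long regression: constant plus both variables
    Xd = np.column_stack([np.ones(n), X])
    long_.append(np.linalg.lstsq(Xd, y, rcond=None)[0][1])

print(np.mean(short))  # near beta1 + rho*beta2 = 3.8, not 2.0
print(np.mean(long_))  # near beta1 = 2.0
```

With unit-variance regressors, the short slope centers on β₁ + ρβ₂, which is the standard omitted-variable-bias formula; including x2 removes the bias, matching the chapter's recommended remedy.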

- Published in print:
- 2011
- Published Online:
- June 2013
- ISBN:
- 9780804772624
- eISBN:
- 9780804777209
- Item type:
- chapter

- Publisher:
- Stanford University Press
- DOI:
- 10.11126/stanford/9780804772624.003.0014
- Subject:
- Economics and Finance, Econometrics

This chapter shows that the addition of a second explanatory variable in Chapter 11 adds only four new things to what there is to know about regression. First, regression uses only the parts of each variable that are unrelated to all of the other variables. Second, omitting a variable from the sample relationship that appears in the population relationship almost surely biases our estimates. Third, including an irrelevant variable does not bias estimates but reduces their precision. Fourth, the number of interesting joint tests increases with the number of slopes. All four remain valid when we add additional explanatory variables.
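The first point, that regression uses only the part of each variable unrelated to the others, is the partialling-out (Frisch-Waugh) logic, and can be verified directly in a simulated example of my own construction: the multiple-regression slope on x1 equals the simple slope of y on the residual of x1 after removing its relationship with the other regressor.

```python
import numpy as np

# Hypothetical sketch of partialling out: the slope on x1 in the full
# regression equals the slope of y on the part of x1 unrelated to x2.
rng = np.random.default_rng(3)
n = 300
x1 = rng.normal(0, 1, n)
x2 = 0.5 * x1 + rng.normal(0, 1, n)                 # x2 related to x1
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(0, 1, n)

# full regression: constant plus both explanatory variables
X = np.column_stack([np.ones(n), x1, x2])
b_full = np.linalg.lstsq(X, y, rcond=None)[0][1]

# residualize x1 on a constant and x2, then regress y on that residual
Z = np.column_stack([np.ones(n), x2])
x1_res = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
b_partial = np.cov(x1_res, y, ddof=1)[0, 1] / np.var(x1_res, ddof=1)

print(np.isclose(b_full, b_partial))  # True
```

This equality is exact, not approximate, which is why adding further explanatory variables changes nothing essential: each slope is always a simple regression on a residualized variable.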