Sunday, March 23, 2008

Classic Linear Regression Assumptions

The classic linear regression model rests on the following assumptions (a small sketch for checking a few of them follows the list):


  1. A linear relationship exists between the dependent variable and the independent variable.

  2. The independent variable is not random.

  3. The expected value of the error term is 0.

  4. The variance for the error term is the same for all observations. (Homoskedasticity)

  5. The error term is normally distributed.

  6. The error term is uncorrelated across observations.
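
For illustration, here is a minimal sketch of what checking a few of these assumptions might look like in practice. It uses synthetic data and NumPy/SciPy, which are my own choices for the example, not something from the post:

```python
import numpy as np
from scipy import stats

# Synthetic data that satisfies the classic assumptions by construction:
# fixed (non-random) x, a linear relationship, and i.i.d. normal errors.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)                    # independent variable (non-random)
y = 2.0 + 0.5 * x + rng.normal(0, 1, x.size)   # linear relationship + normal noise

# Fit y = b0 + b1*x by ordinary least squares.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

print("Estimated coefficients:", beta)

# Assumption 3: the residuals should average out to roughly zero.
print("Mean residual:", residuals.mean())

# Assumption 5: a normality check on the residuals (Shapiro-Wilk test).
print("Shapiro-Wilk p-value:", stats.shapiro(residuals).pvalue)

# Assumption 4 (homoskedasticity): compare the residual spread in the two
# halves of the sample; similar variances are what we would hope to see.
half = x.size // 2
print("Variance, first half :", residuals[:half].var())
print("Variance, second half:", residuals[half:].var())
```

This only eyeballs the assumptions on toy data; with real data you would look at residual plots and more formal diagnostics.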



Practice more, then you will learn :)

2 comments:

Will Dwinnell said...

Though not technically an assumption, the implication of least squares regression is that the analyst cares about squared errors. In my experience, this is often accepted as a "given", though it is not at all necessary for linear regression.

Least absolute error (LAE) regression (also called "L-1 regression"), as the name implies, minimizes the absolute error of the model, which may be better suited to the analysis at hand. I describe LAE regression briefly, along with a MATLAB implementation, in my posting, L-1 Linear Regression.

Yet other regression procedures ("robust" regressions) minimize various error measures designed to be resistant to the effect of outliers.
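
To make the distinction concrete, here is a rough Python sketch of the L-1 idea, minimizing the sum of absolute residuals instead of squared residuals. This is only my own illustration with scipy.optimize, not the MATLAB implementation referred to in the comment above:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data with a few outliers, to show why one might prefer L-1.
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 1.0 + 2.0 * x + rng.normal(0, 1, x.size)
y[::10] += 15                                  # inject some outliers

def lae_loss(beta):
    """Sum of absolute errors for the line y = beta[0] + beta[1]*x."""
    return np.abs(y - (beta[0] + beta[1] * x)).sum()

# Start from the OLS solution and refine with a derivative-free optimizer.
X = np.column_stack([np.ones_like(x), x])
ols_beta, *_ = np.linalg.lstsq(X, y, rcond=None)
lae_beta = minimize(lae_loss, ols_beta, method="Nelder-Mead").x

print("OLS coefficients:", ols_beta)   # pulled toward the outliers
print("LAE coefficients:", lae_beta)   # less affected by the outliers
```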

Eddy said...

Hey Will, thanks for leaving comments.

I totally agree with you. Since the post is about the classic assumptions of the regression model, it generally implies the classical usage of OLS too. :p Just joking.

The bottom line is that there are many error measures to choose from for the prediction error, and whichever one we select, we should make sure we understand its assumptions, among other things.

Cheers.