Linear regression is one of the powerhouse statistical analysis techniques, and I'm always looking for material that improves my understanding of the approach. Not necessarily of the underlying algorithms, but rather of the results and what they mean.
So here are a few documents that give a more detailed explanation of linear regression than I've seen so far. It's by no means an exhaustive list; I'm sure you can find better ones on the web.
An Introduction to Regression Analysis, by Alan Sykes at the University of Chicago School of Law. It has some really good explanation of the regressors and their meaning, which is useful when it comes to interpreting the results. There's also some explanation of goodness-of-fit and hypothesis testing.
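To make the goodness-of-fit and hypothesis-testing ideas concrete, here's a minimal sketch of a simple regression fit in Python with numpy. The data, variable names, and thresholds are my own for illustration; this isn't from any of the notes above. It computes the R² goodness-of-fit measure and the t-statistic you'd use to test whether the slope differs from zero.

```python
import numpy as np

# Toy data: y is roughly linear in x, plus noise (illustrative only)
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 1.5 * x + rng.normal(0, 1.0, size=x.size)

# Ordinary least squares for y = b0 + b1 * x
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1 = beta

# Goodness of fit: R^2 = 1 - SS_res / SS_tot
resid = y - X @ beta
ss_res = np.sum(resid ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Hypothesis test for the slope: t = b1 / se(b1)
n, p = X.shape
sigma2 = ss_res / (n - p)              # residual variance estimate
cov = sigma2 * np.linalg.inv(X.T @ X)  # covariance matrix of the estimates
t_slope = b1 / np.sqrt(cov[1, 1])
```

A large |t| (roughly above 2, for this sample size) lets you reject the hypothesis that the slope is zero; the Sykes paper walks through what that actually means.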
Another excellent MIT Sloan course, this time on Data Mining. There are a couple of lecture notes on the use of multiple regression in data mining — a little advanced, but a solid description of how you might want to use a (relatively) simple technique to answer some sophisticated questions.
From my alma mater, course notes on multiple regression in an Applied Statistics course. The notes look similar to the multiple regression notes in the MIT Data Mining course. There are also some more notes on simple regression and on diagnostic techniques.
Fantastic course notes on applied linear regression by Jamie DeCoster at the University of Alabama. They start simply with some statistical review, then progress to advanced diagnostic techniques such as outlier and multicollinearity analysis.
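Multicollinearity diagnostics are one of those topics that click faster with a small example. Here's a hedged sketch of the variance inflation factor (VIF), a standard multicollinearity check; the function and the toy data are my own, not taken from DeCoster's notes. For each predictor, you regress it on the others and compute VIF = 1 / (1 − R²); values well above ~10 usually flag trouble.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (predictors only,
    no intercept column). VIF_j = 1 / (1 - R_j^2), where R_j^2 comes
    from regressing column j on the remaining columns."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])  # add an intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Toy example: x2 is nearly a copy of x1, x3 is independent of both
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.05, size=200)  # highly collinear with x1
x3 = rng.normal(size=200)
vifs = vif(np.column_stack([x1, x2, x3]))
```

Here the first two VIFs come out large (x1 and x2 carry nearly the same information) while the third stays near 1, which is exactly the pattern these diagnostics are designed to surface.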
Like I said, there are tons of lecture notes out there, so you can find one that is best suited for you, but this list is enough to get you started. Now, go fill that toolbox!