Welcome!

Hello, Wallers, and welcome to Wikipedia! Thank you for your contributions. I hope you like the place and decide to stay. Here are some pages that you might find helpful:

I hope you enjoy editing here and being a Wikipedian! Please sign your name on talk pages using four tildes ~~~~, which will automatically produce your name and the date.

If you need help, check out Wikipedia:Questions, ask me on my talk page, or place {{helpme}} on your talk page and ask your question there. Again, welcome!

Addbot (talk) 02:10, 20 February 2009 (UTC)

Considering changes to numerical methods for linear least squares page

I dislike the discussion of the QR decomposition to solve the linear least squares problem. I much prefer Susan Blackford's discussion on the LAPACK pages: http://netlib.org/lapack/lug/node40.html. I'm considering changing it to something like the following, in line with her discussion.

The QR decomposition allows us to decompose the matrix X into

<math>X = QR,</math>

where <math>Q</math> is an <math>m \times m</math> orthogonal matrix and <math>R</math> is an <math>m \times n</math> upper triangular matrix. Because <math>X</math> is overdetermined, <math>m > n</math>, and we have a special case of the QR decomposition where

<math>R = \begin{bmatrix} R_n \\ 0 \end{bmatrix},</math>

with <math>R_n</math> an <math>n \times n</math> upper triangular matrix.

The goal of the linear least squares solution is to find <math>\hat\beta</math> which minimizes <math>\| y - X\hat\beta \|</math>. Multiplying by an orthogonal matrix does not alter the L2 norm, so

<math>\| y - X\hat\beta \| = \| Q^{\mathsf T} y - R\hat\beta \| = \left\| \begin{bmatrix} (Q^{\mathsf T} y)_n - R_n \hat\beta \\ (Q^{\mathsf T} y)_{m-n} \end{bmatrix} \right\|,</math>

where <math>(Q^{\mathsf T} y)_n</math> denotes the first <math>n</math> components of <math>Q^{\mathsf T} y</math> and <math>(Q^{\mathsf T} y)_{m-n}</math> the remaining <math>m - n</math>.

The upper portion can be solved for <math>\hat\beta</math>,

<math>\hat\beta = R_n^{-1} (Q^{\mathsf T} y)_n,</math>

allowing us to calculate the predicted values

<math>\hat y = X \hat\beta.</math>
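The triangular solve above can be sketched in NumPy; the data here are made up purely for illustration, and `np.linalg.qr` in its default (reduced) mode returns the thin factors, i.e. the first n columns of Q together with the block R_n directly:

```python
import numpy as np

# Illustrative data only: a random overdetermined system (m > n).
rng = np.random.default_rng(0)
m, n = 8, 3
X = rng.standard_normal((m, n))
y = rng.standard_normal(m)

# Reduced QR: Q1 is m-by-n with orthonormal columns and
# Rn is the n-by-n upper triangular block of R.
Q1, Rn = np.linalg.qr(X)

# Solve the triangular system Rn @ beta_hat = Q1.T @ y.
beta_hat = np.linalg.solve(Rn, Q1.T @ y)

# Predicted values.
y_hat = X @ beta_hat

# Cross-check against NumPy's least-squares routine.
beta_ref, *_ = np.linalg.lstsq(X, y, rcond=None)
```

In exact arithmetic `beta_hat` and `beta_ref` coincide, since both minimize the same L2 norm.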

The residual sum of squares can be either calculated by taking the difference of the predicted and actual values,

<math>S = \| y - \hat y \|^2,</math>

or by realizing the residual sum of squares is equal to the portion which was independent of <math>\hat\beta</math>,

<math>S = \| (Q^{\mathsf T} y)_{m-n} \|^2.</math>
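The equivalence of the two expressions for the residual sum of squares can be checked numerically; this is a sketch on made-up random data, using the full ("complete") QR so that Qᵀy has all m components:

```python
import numpy as np

# Illustrative data only.
rng = np.random.default_rng(1)
m, n = 10, 4
X = rng.standard_normal((m, n))
y = rng.standard_normal(m)

# Full QR so Q is m-by-m; Q.T @ y then has all m components.
Q, R = np.linalg.qr(X, mode='complete')
Qty = Q.T @ y

# beta_hat from the upper n-by-n triangular block of R.
beta_hat = np.linalg.solve(R[:n, :], Qty[:n])
y_hat = X @ beta_hat

# RSS from the residuals of the fitted values...
rss_resid = np.sum((y - y_hat) ** 2)
# ...equals the squared norm of the tail, the part independent of beta_hat.
rss_tail = np.sum(Qty[n:] ** 2)
```

The two quantities agree because the orthogonal transform Qᵀ preserves the L2 norm, and the upper block of the transformed residual is driven to zero by the triangular solve.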


Wallers (talk) 15:49, 17 June 2010 (UTC)