
### linear least squares


```It is a math question, but I guess some people here know how linear
least squares works, so I'll take a chance.

Linear least squares minimizes the residual norm ||y - X * b||^2,
where X is the matrix of 'observations', y is the data being
predicted/fitted, and b is the predictor/estimator vector to be found.

The estimator b can be found using the pseudo-inverse of X:

b = (X+) * y

Now if we have multiple y vectors (say, y_1, y_2, ...) we can compute X+
just once and reuse it to find b_1, b_2, etc.

But what if I have multiple X matrices, say, X_1, X_2, ... which are
related to X in some way -- for example, X_1 is X with one extra column?
Is there a way to avoid computing the pseudo-inverse (or a
decomposition) for each of these matrices?

I.e., precompute something for matrix X once and then do a 'cheap'
manipulation for each modified matrix...
```
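For the multiple-y case, the reuse the poster describes can be sketched in numpy as follows (data, sizes, and variable names are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))       # observation matrix
b_true = rng.standard_normal(5)

# Several right-hand sides y_1, y_2, y_3 stacked as columns of Y.
Y = np.column_stack(
    [X @ b_true + 0.01 * rng.standard_normal(100) for _ in range(3)]
)

# Compute the pseudo-inverse once...
X_pinv = np.linalg.pinv(X)

# ...then each estimator b_i is just a matrix-vector product.
B = X_pinv @ Y

# Sanity check against a direct least-squares solve.
B_ref = np.linalg.lstsq(X, Y, rcond=None)[0]
print(np.allclose(B, B_ref))
```

In practice one would usually keep a QR or Cholesky factorization of X instead of forming the pseudo-inverse explicitly, but the reuse pattern is the same.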
Reply alex.mizrahi (227) 7/12/2011 10:34:18 PM

```> But what if I have multiple X matrices, say, X_1, X_2... which are
> related to X in some way -- for example, X_1 is X with one extra column.
> Is there a way to avoid computing pseudo-inverse (or decomposition) for
> each of these matrices?

Hmm, I think an SVD allows one to add rows relatively cheaply using
projection and reorthogonalization, but I have only a very vague idea
of how...

Another question about linear least squares: how does it behave with
quantized observation (X) data?
Obviously, I could build an estimator from quantized data, but what if
I have to build the estimator from precise data and then use it on
quantized data?
Basically, I want it to work for a _range_ of different quantization
steps; building an estimator for each possible step is not an option.
Can I expect some nice behavior here?

I found http://en.wikipedia.org/wiki/Total_least_squares, but it seems
to be overkill here and not quite helpful.
Or am I missing something and it is the right way to do it?
```
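On the "X with one extra column" question: with a precomputed thin QR factorization of X, appending a column only costs one Gram-Schmidt step, not a full refactorization. A minimal sketch (names and data are made up; this is the standard QR column-append update, not something from the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))
y = rng.standard_normal(100)

# Precompute a thin QR factorization of X once: X = Q R.
Q, R = np.linalg.qr(X)                  # Q: 100x5, R: 5x5

# To append a column v, orthogonalize v against Q (one
# Gram-Schmidt step) instead of refactorizing [X v].
v = rng.standard_normal(100)
w = Q.T @ v                             # components of v in span(Q)
r = v - Q @ w                           # residual orthogonal to span(Q)
rho = np.linalg.norm(r)
Q1 = np.column_stack([Q, r / rho])
R1 = np.block([[R, w[:, None]],
               [np.zeros((1, R.shape[1])), np.array([[rho]])]])

# Least-squares solution for the augmented matrix [X v]:
b1 = np.linalg.solve(R1, Q1.T @ y)

# Sanity check against a direct solve.
b_ref = np.linalg.lstsq(np.column_stack([X, v]), y, rcond=None)[0]
print(np.allclose(b1, b_ref))
```

Removing a column (or adding/removing rows) has analogous low-rank QR updates; see Golub & Van Loan's treatment of updating factorizations.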
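On the quantization question, a quick numerical sketch (all names and data mine) of what happens when an estimator fitted on precise X is applied to quantized X. Under the usual model, uniform quantization with step q behaves roughly like additive noise with standard deviation q/sqrt(12) per entry, so one would expect the prediction error to grow roughly linearly with the step:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 4))
b_true = rng.standard_normal(4)
y = X @ b_true

# Estimator built from the precise observations.
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Apply that same estimator to quantized observations,
# for a range of quantization step sizes.
errs = []
for step in (0.01, 0.1, 0.5):
    Xq = np.round(X / step) * step      # uniform quantization
    rel_err = np.linalg.norm(Xq @ b - y) / np.linalg.norm(y)
    errs.append(rel_err)
    print(f"step={step}: relative error {rel_err:.4f}")
```

This only illustrates the degradation; whether total least squares (or an errors-in-variables model) is warranted depends on how large the step is relative to the signal.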
Reply alex.mizrahi (227) 7/13/2011 8:01:37 AM
