[SciPy-user] solving linear equations ?
Wed Nov 5 14:11:29 CST 2008
2008/11/5 Stef Mientki <email@example.com>:
> (forgive me, my math is a bit rusty, so I don't know the right terms anymore)
> If I want to solve a set of linear equations,
> I use in MatLab:
> a \ b
> this also works if I have too many equations, so more rows than columns.
> In Numpy for Matlab users
> I read this:
> linalg.solve(a,b) if a is square
> linalg.lstsq(a,b) otherwise
> I find the name already suspicious; it sounds like least squares,
> which is confirmed by the help.
> So I guess the translation from MatLab to Numpy is not correct.
> Is there a function to reduce the number of columns / remove the
> redundancy, so I end up with a square matrix ?
Actually, MATLAB uses least-squares too, if the problem is
overdetermined (or underdetermined). It turns out that this is a very
good idea. If you have a problem with more equations than unknowns in
which some of the equations are redundant, then you could in principle
throw away the extra equations. But numerical solution of systems of
linear equations is a messy business, full of creeping roundoff error
and instabilities which can blow your solution away. So using a
least-squares approach to solve all the equations gives you a better
answer. If the equations *aren't* redundant, the least-squares
approach gets as close as possible to an answer.
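A small sketch of that behavior, using NumPy's `linalg.lstsq` on a made-up overdetermined system (the matrix and right-hand side are illustrative, not from the original thread):

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns.
# Each row is one equation; lstsq minimizes ||a @ x - b||_2.
a = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([6.0, 9.0, 12.0])  # consistent: the third row is implied by the first two

x, residuals, rank, sv = np.linalg.lstsq(a, b, rcond=None)
print(x)  # close to [3., 3.]
```

Because the extra equation is redundant rather than contradictory, the least-squares answer is also the exact answer; if `b` were perturbed so the rows conflicted, `lstsq` would instead return the closest compromise.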
MATLAB also works when you have more unknowns than equations; in this
case it gives you the smallest solution in a least-squares sense. This
too is a good idea for numerical reasons.
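The underdetermined case can be sketched the same way; `lstsq` picks the minimum-norm solution out of the infinitely many that exist (again a toy system, not from the thread):

```python
import numpy as np

# Underdetermined system: 1 equation, 2 unknowns (x + y = 2).
# Infinitely many solutions; lstsq returns the one with the
# smallest Euclidean norm, which here is [1, 1].
a = np.array([[1.0, 1.0]])
b = np.array([2.0])

x, *_ = np.linalg.lstsq(a, b, rcond=None)
print(x)  # close to [1., 1.]
```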
All this is done under the hood with the singular value decomposition,
which in fact uses a least-squares approach when throwing away
dimensions that have been badly corrupted by roundoff error.
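The "throwing away corrupted dimensions" step can be made explicit with `numpy.linalg.svd`: singular values below some tolerance are treated as zero and simply excluded from the solution (the matrix and tolerance below are illustrative assumptions):

```python
import numpy as np

# A nearly rank-deficient matrix: the second direction is so small
# it is dominated by roundoff, so we discard it.
a = np.array([[1.0, 0.0],
              [0.0, 1e-16]])
b = np.array([1.0, 1.0])

u, s, vt = np.linalg.svd(a, full_matrices=False)
keep = s > 1e-10 * s.max()                  # drop tiny singular values
x = vt.T[:, keep] @ ((u.T @ b)[keep] / s[keep])
print(x)  # close to [1., 0.]: the corrupted direction is ignored
```

This is essentially what `lstsq`'s `rcond` parameter controls internally.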
All that said, it would still be nice to have at least a cookbook
example that implements a generic linsolve equivalent to MATLAB's
backslash operator.
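A minimal sketch of such a linsolve (the function name and the fallback policy are my assumptions, not an established SciPy API): use an exact solve when the matrix is square and nonsingular, and fall back to least squares otherwise, roughly mirroring backslash.

```python
import numpy as np

def linsolve(a, b):
    """Rough analogue of MATLAB's a \\ b (hypothetical helper):
    exact solve for square nonsingular systems, least squares otherwise."""
    a = np.asarray(a, dtype=float)
    m, n = a.shape
    if m == n:
        try:
            return np.linalg.solve(a, b)
        except np.linalg.LinAlgError:
            pass  # singular matrix: fall through to least squares
    x, *_ = np.linalg.lstsq(a, b, rcond=None)
    return x

# Square, nonsingular system:
print(linsolve([[2.0, 0.0], [0.0, 4.0]], [2.0, 8.0]))  # close to [1., 2.]
```

Note that MATLAB's backslash is considerably smarter than this (it dispatches on triangular, symmetric, and sparse structure), so this is only a starting point.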