[SciPy-dev] Generic polynomials class (was Re: Volunteer for Scipy Project)
Charles R Harris
Tue Oct 6 23:00:28 CDT 2009
On Tue, Oct 6, 2009 at 9:34 PM, Anne Archibald <email@example.com>wrote:
> 2009/10/6 Charles R Harris <firstname.lastname@example.org>:
> > On Tue, Oct 6, 2009 at 7:37 PM, Anne Archibald wrote:
> >> 2009/10/6 David Goldsmith <email@example.com>:
> >> > IMO this thread has matured to the point where someone(s) should
> >> > formally
> >> > propose a feature spec/object structure for community vetting. :-)
> >> http://www.scipy.org/NewPolynomials
> >> I don't propose an object structure, because I'm not sure how that
> >> should look, and I'm also not sure how reasonable my various
> >> requirements and restrictions are. On the other hand I did list at the
> >> end some references I found in a quick literature search; it appears
> >> the wisdom is that for general polynomial manipulation it's best to
> >> use the Bernstein basis (the same used in Bezier splines) or the
> >> Lagrange basis (representing polynomials by (x,y) pairs). In either
> >> case papers are available describing algorithms for all the basic
> >> polynomial operations.
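For concreteness, a minimal sketch of evaluating a polynomial given in the Bernstein basis on [0, 1] via de Casteljau's algorithm, which is the numerically stable evaluation scheme usually paired with that basis (illustrative code, not from the thread; the function name is made up):

```python
import numpy as np

def bernstein_eval(coeffs, t):
    """Evaluate a polynomial given by its Bernstein-basis coefficients on
    [0, 1] using de Casteljau's algorithm (stable for t in [0, 1])."""
    b = np.asarray(coeffs, dtype=float)
    # Each pass takes convex combinations of adjacent coefficients,
    # shrinking the array by one until a single value remains.
    for _ in range(len(b) - 1):
        b = (1.0 - t) * b[:-1] + t * b[1:]
    return b[0]

# The identity p(x) = x has degree-2 Bernstein coefficients [0, 1/2, 1].
print(bernstein_eval([0.0, 0.5, 1.0], 0.25))  # -> 0.25
```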
> > Bernstein and Lagrange would both be defined on interval domains, [0,1]
> > and [-1,1] respectively, and that would define both center and scaling if
> > arbitrary intervals are mapped to those domains, just like for the
> > polynomials. Hmm...
> Yes, I haven't found any discussion of polynomial bases for unbounded
> domains. It's not even clear to me what the right criterion for error
> is, since any polynomial basis is going to become large far enough
> away from the origin. If what one cared about were some sort of
> weighted mean squared error, then some of the classical orthogonal
> polynomials would be the right choice, but that has little relevance
> for questions like root-finding or max-difference error. I suppose
> some sort of weighted max-difference error might be usable - perhaps
> weighted by exp(-x**2) or some such thing - but without some serious
> numerical analysis it's not clear what to do there.
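As a rough illustration of the weighted-error idea, here is a sketch using NumPy's `hermite_e` module, whose basis is orthogonal under a Gaussian weight; the specific weight and target function are hypothetical choices for demonstration, not anything settled in the thread:

```python
import numpy as np
from numpy.polynomial import hermite_e as He  # "probabilists'" Hermite basis

# Fit on a wide range, weighting residuals by exp(-x**2 / 2) so that
# error far from the origin counts for less -- one possible stand-in
# for a weighted max-difference criterion on an unbounded domain.
x = np.linspace(-4, 4, 201)
y = np.tanh(x)
w = np.exp(-x**2 / 2)              # illustrative weight choice
coef = He.hermefit(x, y, deg=9, w=w)

# Weighted pointwise error of the fit.
err = np.abs(He.hermeval(x, coef) - y) * w
print(err.max())
```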
> I'd guess that for most practical applications a specific interval can
> be chosen. That does mean one needs to have parameterized bases, but
> is otherwise inoffensive.
> There's a comment in one of those references (I forget which just now)
> to the effect that the Chebyshev basis is a poor choice if one wants
> to find the roots of the resulting polynomial; on the other hand it's
> an excellent basis for finding approximating polynomials. This is the
> sort of thing that makes me want multiple representations.
There was something like that for the second Wilkinson polynomial, and
understandably so, as the wiggles get very small. All the zero-finding
algorithms seemed to depend on some form of the companion matrix, but of
course the exact form of the companion matrix depended on the basis. The
Lagrange bases are parameterized by the sample points, so there are
actually lots, and lots, well, and lots of those bases. If the samples are
at the Chebyshev points, they are just a well-behaved transform away from
the Chebyshev coefficients.
I wasn't able to get at the paper on division, which is the one that really
roused my curiosity. Maybe the library will have it.
One of the papers didn't regard degree as important, probably because 1) the
Bernstein polynomials aren't graded by degree, 2) neither are the Lagrange
functions, and 3) it is difficult to define degree in those contexts anyway.
At some point I think we will have to decide whether we want
"polynomial approximation", "orthogonal polynomials", or both. Both seem to
have their virtues.
Thanks for putting that page together, that helps get things organized. It
also grows what started as a simple Chebyshev class into something more
extensive ;) Oh well...