[SciPy-dev] Generic polynomials class (was Re: Volunteer for Scipy Project)

Anne Archibald peridot.faceted@gmail....
Tue Oct 6 10:02:17 CDT 2009

```2009/10/6 Charles R Harris <charlesr.harris@gmail.com>:
>
>
> On Mon, Oct 5, 2009 at 10:14 PM, Anne Archibald <peridot.faceted@gmail.com>
> wrote:
>>
>> 2009/10/5 Charles R Harris <charlesr.harris@gmail.com>:
>> >
>> >
>> > On Mon, Oct 5, 2009 at 7:29 PM, Anne Archibald
>> > <peridot.faceted@gmail.com>
>> > wrote:
>> >>
>> >> 2009/10/5 Charles R Harris <charlesr.harris@gmail.com>:
>> >> >
>> >> >
>> >> > On Mon, Oct 5, 2009 at 3:20 PM, Anne Archibald
>> >> > <peridot.faceted@gmail.com>
>> >> > wrote:
>> >> >> For examples where I think a bit of lit review plus implementation
>> >> >> work might help, I'd say that the orthogonal polynomials could use
>> >> >> some work - the generic implementation in scipy.special falls apart
>> >> >> rapidly as you go to higher orders. I always implement my own
>> >> >> Chebyshev polynomials using the cos(n*arccos(x)) expression, for
>> >> >> example, and special implementations for the others might be very
>> >> >> useful.
>> >> >>
>> >> >
>> >> > At what order does the scipy implementation of the Chebyshev
>> >> > polynomials
>> >> > fall apart? I looked briefly at that package a long time ago, but
>> >> > never
>> >> > used
>> >> > it. I ask so I can check the chebyshev module that is going into
>> >> > numpy.
>> >>
>> >> By n=30 they are off by as much as 0.0018 on [-1,1]; n=31 they are off
>> >> by 0.1, and by n=35 they are off by four - not great for values that
>> >> should be in the interval [-1,1]. This may seem like an outrageously
>> >> high degree for a polynomial, but there's no reason they need be this
>> >> bad, and one could quite reasonably want to use an order this high,
>> >> say for function approximation.
>> >>
>> >
>> > Hmm, I get a maximum error of about 1e-13 for n=100 using my routine,
>> > which
>> > isn't great and can be improved a bit with a few tricks, but is probably
>> > good enough.  That's using simple Clenshaw recursion for the evaluation.
>> > There must be something seriously wrong with the scipy version. I
>> > routinely
>> > use fits with n > 50 because truncating such a series gives a good
>> > approximation to the minimax fit and it's also nice to see how the
>> > coefficients fall off to estimate the size of the truncation error.
>>
>> Upon closer inspection, it seems that scipy starts from the recurrence
>> relation but then uses it only to extract roots, which it then uses to
>> construct a poly1d instance. (Which presumably then goes on to convert
>> it into the 1, x, x**2, ... basis.) This is numerically disastrous.
>>
>> The reason it does this is because scipy doesn't (didn't) have a
>> general-purpose polynomial class that can keep the polynomial in other
>> representations. What do you think of the idea of a polynomial class
>> that supports several different bases for the set of polynomials? (At
>> least, the power basis and Chebyshev polynomials, but possibly also
>> the other orthogonal polynomials. Translated and scaled bases might
>> come in handy too.) Too general? Not general enough (barycentric
>> representations, representation in terms of roots)?
>
> Barycentric representation using the type II Chebyshev points would be the
> way to go for *all* the functions without singularities at the ends, except
> I couldn't figure out how to get the remainder during division, and the
> remainder is needed for converting to the coefficients of the various types
> of series -- it's just like converting a decimal representation to binary by
> repeated division by two and keeping the remainders. That doesn't mean it
> can't be done, however...

We even already have compiled code for evaluation. But there are
several polynomial operations I can't see an easy way to handle using
this. It seems to me that with a polynomial class one should be able
to easily:

* evaluate
* multiply by another polynomial
* divide with remainder
* extract derivatives and antiderivatives, both at a point and as polynomials
* express in another polynomial basis
* measure degree
(Have I missed any?)

With barycentric interpolation, it's not clear how to divide or take
antiderivatives (and hence converting bases becomes awkward). On the
other hand, multiplication is O(n) rather than O(n^2). And I think
antidifferentiation could be worked out with some cleverness.

Nevertheless I think one would want to be able to work with orthogonal
polynomial bases, for example when dealing with an infinite (or
unspecified) interval. Plus one needs to record all the various extra
information about each family of orthogonal polynomials (roots,
weights for integration, etc.).

Anne
```
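
The breakdown Anne describes is easy to reproduce. Expanding a high-degree Chebyshev polynomial into the monomial basis 1, x, x**2, ... (which is what routing through `poly1d` forces) loses accuracy to cancellation among huge coefficients, while the cos(n*arccos(x)) expression stays on [-1, 1]. A minimal sketch; the `cheb2poly` conversion here is a stand-in for the old scipy code path, not the actual scipy.special code:

```python
import numpy as np

n = 35
x = np.linspace(-1, 1, 1001)

# Stable evaluation: T_n(x) = cos(n * arccos(x)) on [-1, 1]
t_stable = np.cos(n * np.arccos(x))

# Unstable route: express T_n in the monomial basis and evaluate that.
# The monomial coefficients of T_35 reach ~1e12, so cancellation
# destroys many digits even though each coefficient is exact.
c = np.zeros(n + 1)
c[n] = 1.0
power_coeffs = np.polynomial.chebyshev.cheb2poly(c)
t_power = np.polynomial.polynomial.polyval(x, power_coeffs)

err = np.max(np.abs(t_power - t_stable))
print(f"max |error| at degree {n} via the monomial basis: {err:.3g}")
```

The roots-then-`poly1d` route Anne found in scipy is even worse than this direct conversion, since forming the polynomial from its roots adds another unstable step.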
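
The Clenshaw recursion Charles mentions takes only a few lines. This is a hypothetical helper in the spirit of his routine, not his actual code:

```python
import numpy as np

def clenshaw_chebyshev(x, c):
    """Evaluate sum_k c[k] * T_k(x) with Clenshaw's backward recurrence.

    No individual T_k is ever formed explicitly, which is why this
    stays accurate even for series of degree 100 and beyond.
    """
    b1 = b2 = 0.0
    for ck in c[:0:-1]:                # c[n], c[n-1], ..., c[1]
        b1, b2 = ck + 2.0 * x * b1 - b2, b1
    return c[0] + x * b1 - b2

# Compare against cos(n*arccos(x)) for the pure T_100 series.
xs = np.linspace(-1, 1, 1001)
c = np.zeros(101)
c[100] = 1.0
err = max(abs(clenshaw_chebyshev(xv, c) - np.cos(100 * np.arccos(xv)))
          for xv in xs)
print(f"max error for T_100 via Clenshaw: {err:.2g}")
```

The observed error at n=100 is consistent with the ~1e-13 Charles reports.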
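
Evaluation in Charles's proposed representation is also short: the second (barycentric) form of Lagrange interpolation at type-II Chebyshev points has the especially simple weights (-1)^j, halved at the two endpoints. A sketch with made-up function names:

```python
import numpy as np

def cheb2_points(n):
    """Chebyshev points of the second kind, x_j = cos(j*pi/n)."""
    return np.cos(np.pi * np.arange(n + 1) / n)

def bary_eval(x, xj, fj):
    """Second-form barycentric evaluation of the interpolant through (xj, fj).

    For type-II Chebyshev points the weights reduce to (-1)^j with the
    endpoint weights halved; any common factor cancels in the quotient.
    """
    n = len(xj) - 1
    w = (-1.0) ** np.arange(n + 1)
    w[0] *= 0.5
    w[-1] *= 0.5
    d = x - xj
    j = np.argmin(np.abs(d))
    if d[j] == 0.0:                    # x coincides with a node
        return fj[j]
    t = w / d
    return np.sum(t * fj) / np.sum(t)

xj = cheb2_points(5)
print(bary_eval(0.3, xj, xj**3))       # interpolating x**3 exactly
```

Multiplication in this representation is pointwise at the nodes, hence the O(n) Anne mentions, provided both factors are first resampled at enough points to represent the product; division with remainder, as Charles notes, has no similarly obvious formula.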
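
Anne's list of operations maps almost one-to-one onto what a basis-aware polynomial class would expose. A minimal, hypothetical sketch for the Chebyshev basis, built on the module-level helpers in the numpy chebyshev module Charles mentions (the class itself is illustrative, not numpy's API):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

class ChebPoly:
    """Sketch of a polynomial kept in the Chebyshev basis throughout."""

    def __init__(self, coef):
        self.coef = np.atleast_1d(np.asarray(coef, dtype=float))

    def __call__(self, x):                 # evaluate (Clenshaw internally)
        return C.chebval(x, self.coef)

    def __mul__(self, other):              # multiply by another polynomial
        return ChebPoly(C.chebmul(self.coef, other.coef))

    def __divmod__(self, other):           # divide with remainder
        q, r = C.chebdiv(self.coef, other.coef)
        return ChebPoly(q), ChebPoly(r)

    def deriv(self):                       # derivative, as a polynomial
        return ChebPoly(C.chebder(self.coef))

    def integ(self):                       # antiderivative (zero at x = 0)
        return ChebPoly(C.chebint(self.coef))

    def to_power_basis(self):              # express in 1, x, x**2, ...
        return C.cheb2poly(self.coef)

    @property
    def degree(self):                      # measure degree
        return len(np.trim_zeros(self.coef, 'b')) - 1

p = ChebPoly([0.0, 1.0])                   # T_1, i.e. the polynomial x
q = p * p                                  # x**2, stored as (T_0 + T_2)/2
```

Crucially, nothing here ever leaves the Chebyshev basis unless the caller explicitly asks, which is what the roots-to-`poly1d` route got wrong.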