[Numpy-discussion] the direction and pace of development
perry at stsci.edu
Thu Jan 22 19:25:01 CST 2004
Colin J. Williams writes:
> I have wondered whether the desire to be compatible with Numeric has
> been an inhibitory factor for numarray. It might be interesting to see
> the list of decisions which Eric Jones doesn't like.
There weren't that many. The ones that I remember (and if Eric has
time he can fill in the rest) were:
1) Default axis for operations. Some operations use the last axis and
some use the first, depending on context. Eric and Travis wanted a
consistent rule (I believe the last axis always). I believe that
scipy wraps Numeric so that it does just that (remember, the
behavior of Numeric within scipy is not quite the same as that of
the distributed Numeric; correct me if I'm wrong).
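The inconsistency can be seen even in present-day NumPy, whose
defaults echo old Numeric here (a sketch for illustration, not a
claim about Numeric's exact 2004 behavior): reductions such as
add.reduce default to the first axis, while functions such as sort
default to the last.

```python
import numpy as np

a = np.array([[3, 1],
              [2, 4]])

# add.reduce defaults to axis 0 (the FIRST axis): column sums.
col_sums = np.add.reduce(a)

# sort defaults to axis -1 (the LAST axis): each row sorted.
row_sorted = np.sort(a)

print(col_sums)    # [5 5]
print(row_sorted)  # [[1 3]
                   #  [2 4]]
```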
2) Allowing complex comparisons. Python no longer allows these (and
it is reasonable to question whether that was right, since complex
numbers can no longer take part in a generic Python sort). Many
felt that numarray should be consistent with Python. This isn't a
big issue, since I had argued that those who wanted to do generic
comparisons simply needed to compare x.real; the .real attribute
is available for arrays of all types, so using it always works
regardless of the type.
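The workaround described above looks roughly like this in modern
Python/NumPy (a sketch, not numarray's own code): ordering
comparisons on complex scalars raise TypeError, but comparing on
the .real attribute works for every array type.

```python
import numpy as np

z = np.array([3 + 1j, 1 + 9j, 2 - 4j])

# Ordering comparisons on complex values are not defined in Python:
try:
    (1 + 2j) < (3 + 0j)
except TypeError:
    print("complex ordering comparison rejected")

# The workaround: compare on .real, which every array type provides,
# so the same code works whether the array is complex or not.
order = np.argsort(z.real)
print(z[order])  # sorted by real part
```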
3) Having single-element indexing return a rank-0 array rather
than a Python scalar. Numeric is quite inconsistent in this
regard now. We decided to have numarray always return Python
scalars (exceptions may be made if Float128 is supported).
The argument for rank-0 arrays was that they would support generic
programming, so that one didn't need to test what kind of value
(scalar or array) many functions returned. But the point of
contention was that Eric argued that len(rank-0) == 1 and that a
rank-0 array should evaluate to its value, neither of which is
correct under the strict definition of rank-0. We argued that
rank-1, length-1 arrays were really what was needed for that kind
of programming. It turned out that the most common need was for
the result of reduction operations, so we provided a version of
reduce (areduce) which always returns an array result even when
reducing a 1-d array (the result would be a length-1 array).
There are others, but I don't recall immediately.
> > It is not the interface but the implementation that started this
> > furor. Travis O.'s suggestion was to back port (much of) the numarray
> > interface to the Numeric code base so that those stuck supporting
> > large code bases (like SciPy) and needing fast small arrays could
> > benefit from the interface enhancements. One or two of them had
> > backward compatibility issues with Numeric, so he asked how it should
> > be handled. Unless some magic porting fairy shows up, SciPy will be a
> > Numeric only tool for the next year or so. This means that users of
> > SciPy either have to forgo some of these features or back port.
> Back porting would appear, to this outsider, to be a regression. Is
> there no way of changing numarray so that it has the desired speed for
> small arrays?
If it must be faster than Numeric for small arrays, I do wonder
whether that can be done without greatly complicating the code.
> I am surprised that alltrue() performance is a concern, but it should be
> easy to implement short circuit evaluation so that False responses are,
> on average, handled more quickly. If Boolean arrays are significant,
> in terms of the amount of computer time taken, should they be stored as
> bit arrays? Would there be a pay-off for the added complexity?
Making alltrue fast in numarray would not be hard; it just takes
some work writing a special-purpose function that short circuits.
I doubt very much that bit arrays would be much faster, and they
would also greatly complicate the code base. It is possible to add
them, but I've always felt the reason to do so would be to save
memory, not to increase speed. They haven't been a high priority
for us.
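The kind of special-purpose short-circuiting function meant here
might look like the following sketch (the name and chunked strategy
are my invention for illustration): scan the flattened array a chunk
at a time and bail out at the first chunk containing a false
element, instead of always touching every element.

```python
import numpy as np

def alltrue_shortcircuit(a, chunk=4096):
    """Hypothetical short-circuiting alltrue: returns False as soon
    as any chunk contains a false element."""
    flat = np.ravel(a)
    for start in range(0, flat.size, chunk):
        if not flat[start:start + chunk].all():
            return False
    return True

big = np.ones(10_000_000, dtype=bool)
big[10] = False   # an early False lets the scan stop almost at once
print(alltrue_shortcircuit(big))                       # False
print(alltrue_shortcircuit(np.ones(100, dtype=bool)))  # True
```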