[SciPy-user] Arrayfns in numpy?

Robert Cimrman cimrman3@ntc.zcu...
Wed Mar 7 06:09:19 CST 2007


Dave wrote:
> On Tue, 06 Mar 2007 Travis Oliphant wrote:
>> That is generally what we believe as well.  The problem is that Numeric
>> already included several additional features.   Trying to maintain some
>> semblance of backward compatibility is why NumPy has not shrunk even more.
> 
>> Because the interp function was already in Numeric and is not that big,
>> perhaps it should be added to NumPy.
> 
> I just realized that I have been struggling to install scipy, as it
> turns out, just to use an interpolation function.  I usually try to
> minimize dependencies for anything I plan to distribute to others.
> My experience so far seems to support arguments on both sides of
> whether to add interpolation to numpy.
> 
> As I see it, there are three useful levels of array functionality.
> First, Python itself should ideally allow array objects to be
> handled with ease and efficiency, and promote "...one way to do
> it..." as per Python dogma.  Hopefully, adding ndarray to the
> standard library will make this more of a reality some day.
> 
> Second, there are common math functions and manipulations over
> arrays that are potentially useful for pedestrian as well as
> esoteric applications.  Sin, cos, basic integration and
> differentiation, matrix and vector math, simple statistics, etc.,
> all reasonably fall within this category.  Hobbyist game programmers
> as well as scientists can use them.  Basic 1d and perhaps 2d
> interpolation also belong to this level of functionality, in my
> opinion.  The capabilities and the API need to be stable and
> extremely reliable.  Numpy today provides a relatively rich and
> efficient package with many of these features and properties.
> 
> There is no hard cutoff, but obscure (to me) probability
> distributions, simulation frameworks, image processing functions
> named after a living person, field-specific packages, etc., are all
> wonderful to have when you need them but are best suited for the
> third level of functionality, scipy.  Sophisticated packages will
> have dependencies and might be updated often or have API changes as
> capabilities evolve.  It makes sense to isolate these from the more
> generic and stable features in numpy.
> 
> Included in numpy for the important practical goal of backward
> compatibility are objects like, I would guess, numpy.kaiser and
> others that don't seem to fit the stated divisions well.  If
> interpolation is in Numeric, then numpy at least deserves a
> compatibility function.  But as a user without legacy requirements,
> I would prefer one designed for numpy and for future applications.
> 
> The other aspect of the argument concerns the dependency issue.  I
> have not been able to use the scipy.interpolate module because of
> what appears to be some dependency that I can't resolve.  I
> certainly don't want that problem moved to numpy, and I sympathize
> very much with "less is more" arguments in this respect.  So I would
> like basic interpolation features to be available in numpy if the
> functions can be added without introducing new dependencies.  In
> general, I don't see a problem with adding select numerical features
> if it is done carefully and with the costs/benefits of the whole
> package in mind.
> 
> In summary:
> +1 interpolation for numpy,
> -1 new dependencies for numpy,
> +1 balanced practical approach to adding numpy features
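
For concreteness, the call Dave is after is plain 1-D linear
interpolation.  A minimal sketch, assuming the Numeric-style interp is
adopted into numpy under the name numpy.interp:

    import numpy as np

    # known sample points (xp must be increasing) and their values
    xp = np.array([0.0, 1.0, 2.0, 3.0])
    fp = np.array([0.0, 10.0, 20.0, 15.0])

    # evaluate the piecewise-linear interpolant at new points
    x = np.array([0.5, 1.5, 2.5])
    y = np.interp(x, xp, fp)    # -> array([  5. ,  15. ,  17.5])

Nothing beyond numpy itself is needed for a call like this, which is
what makes it a reasonable candidate for inclusion.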

As Dave correctly mentioned, there are three levels of functionality:
1. core of numpy (ndarray) - basic array handling
2. 'general purpose' functions (like interp, linalg.*, ...) with no
external dependencies, no Fortran, easy to install
3. full scipy

Currently, 1. and 2. live in one package (numpy) - why not make it two?
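
Roughly, the split looks like this from user code (the exact names
are only illustrative; scipy.interpolate.interp1d stands in for the
"full scipy" level):

    import numpy as np

    # 1. core: ndarray creation and basic handling
    x = np.linspace(0.0, 2.0 * np.pi, 50)

    # 2. general-purpose functions shipped alongside the core,
    #    no external dependencies, no Fortran needed
    y = np.sin(x)
    yi = np.interp(np.array([0.1, 0.2]), x, y)
    s = np.linalg.solve(np.eye(2), np.ones(2))

    # 3. full scipy, with Fortran extensions and heavier dependencies
    from scipy.interpolate import interp1d
    f = interp1d(x, y, kind='cubic')
    yc = f(np.array([0.1, 0.2]))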

r.


