[SciPy-dev] numpy - dual.py problems
Arnd Baecker
arnd.baecker at web.de
Sun Jan 8 05:38:30 CST 2006
Hi Travis,
On Sat, 7 Jan 2006, Travis Oliphant wrote:
> Arnd Baecker wrote:
>
> >Hi,
> >
> >A typical scenario for "end-users" is the following:
> >- people will have Numeric/numarray + old scipy/old new scipy
> > on their machines.
> >
> > In many cases this is the system-wide installation as done by
> > the root-user (eg. via some package manager)
> >
> > The "end-user" has no root rights.
> >- The "end-user" hears about the great progress wrt numpy/scipy combo
> > and wants to test it out.
> >
> > He downloads numpy and installs it to some place in his
> > homedirectory via
> >
> > python setup.py install --prefix=<~/somewhere>
> >
> > and sets his PYTHONPATH accordingly
> >- Then `import numpy` will work, but a
> > `numpy.test(10)` will fail because `dual.py`
> > picks his old scipy (which will be visible from the
> > traceback, if he looks carefully at the path names).
> >
> >- Consequence, the "end-user" will either ask a question on the
> > mailing list or just quit his experiment and continue
> > to work with his old installation.
> >
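The shadowing in that scenario can be diagnosed with a few lines of Python (a sketch only; the --prefix location and resulting paths are the user's own):

```python
# Hedged sketch: a quick check a user in the scenario above could run
# after installing numpy under --prefix and adjusting PYTHONPATH, to see
# which numpy (and which scipy, if any) actually gets imported.
import numpy
print(numpy.__file__)          # should point into the home directory
try:
    import scipy
    # may still be the old system-wide scipy that dual.py would pick up
    print(scipy.__file__, getattr(scipy, '__version__', 'unknown'))
except ImportError:
    print('no scipy on the path')
```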
> This has been fixed now so that it will only use scipy if it can find
> version 0.4.4 or higher...
Great - I think this will prevent a lot of problems!
> >Later on Travis' wrote:
> >"""The solution is that now to get at functions that are in both numpy and
> >scipy and you want the scipy ones first and default to the numpy ones if
> >scipy is not installed, there is a numpy.dual module that must be
> >loaded separately that contains all the overlapping functions."""
> >
> >I think this is fine if a user does this in his own code,
> >but I have found the following `from numpy.dual import`s within numpy itself:
> > core/defmatrix.py: from numpy.dual import inv
> > lib/polynomial.py: from numpy.dual import eigvals, lstsq
> > lib/mlab.py:from numpy.dual import eig, svd
> > lib/function_base.py: from numpy.dual import i0
> >
> BTW, these are all done inside of a function call.
Yes - I saw that.
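The deferred-import pattern under discussion can be sketched like this: the dual import sits inside the function body, so importing numpy never pulls in scipy, and the backend is only resolved at call time. `kaiser_like` is a hypothetical stand-in for the kaiser-window use Travis mentions, not numpy's actual code:

```python
import numpy as np

def kaiser_like(M, beta):
    # deferred import: prefer the dual mechanism's i0 when present
    try:
        from numpy.dual import i0   # present in old numpy; removed later
    except ImportError:
        from numpy import i0        # plain-numpy fallback
    n = np.arange(M)
    alpha = (M - 1) / 2.0
    return i0(beta * np.sqrt(1 - ((n - alpha) / alpha) ** 2)) / i0(beta)
```

The window peaks at 1.0 in the middle regardless of which i0 backend was picked.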
> I want to be able
> to use the special.i0 method when it's available inside numpy (for the
> kaiser window). I want to be able to use a different inverse for matrix
> inverses and better eig and svd for polynomial root finding.
Are these numerically better, or just faster?
I very much understand your point of view on this
(until Fernando's mail I would have silently agreed ;-).
On the other hand, I think Fernando's point stands:
the mere installation of scipy
implicitly changes the behaviour of numpy,
without the user being aware of this
or having asked for the change.
Now, it could be that this works fine in 99.9% of
the cases, but if it does not, it might
be very hard to track down.
So I am still thinking that something like a
numpy.enable_scipy_functions()
might be a better approach.
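A minimal sketch of that opt-in (the name enable_scipy_functions() is Arnd's proposal, not an existing numpy API; `inv` stands in for the whole overlapping set). Nothing changes behind the user's back: scipy-backed routines are swapped in only on explicit request:

```python
import numpy as np

_impl = {'inv': np.linalg.inv}       # numpy-only defaults

def enable_scipy_functions():
    """Explicitly opt in to scipy-backed replacements."""
    try:
        import scipy.linalg
    except ImportError:
        return False                 # stay with plain numpy
    _impl['inv'] = scipy.linalg.inv
    return True

def inv(a):
    # dispatches to whichever backend is currently enabled
    return _impl['inv'](a)
```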
Let me give another example:
when we use highly optimized linear algebra
routines for parallel computers (namely `scsl`),
the whole thing screws up (just hangs)
if the number of CPUs has not been set beforehand
via the environment variable
OMP_NUM_THREADS.
Assume we have some code which should run on just
one CPU and use only numpy.
Then the implicit import brings in the other
library and everything hangs.
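The failure mode can be guarded against only by pinning the environment before any import that might drag the parallel library in implicitly (assumption: the threaded library reads OMP_NUM_THREADS once, at load time):

```python
# The thread count must be fixed in the environment *before* the
# threaded linear-algebra library is loaded; setting it afterwards
# has no effect on a library that read it at load time.
import os
os.environ.setdefault('OMP_NUM_THREADS', '1')
import numpy   # only now is it safe to trigger the import chain
```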
> So, I don't see this concept of enhancing internal functions going
> away. Now, I don't see the current numpy.dual approach as the
> *be-all*. I think it can be improved on. In fact, I suppose some
> mechanism for registering replacement functions should be created
> instead of giving special place to SciPy. SciPy could then call these
> functions. This could all be done inside of numpy.dual. So, I think
> the right structure is there....
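The registration mechanism Travis floats could be sketched as follows: numpy.dual keeps a registry, scipy (or any package) registers replacements, and numpy's internals fall back to their own routines otherwise. All names here are illustrative, not an actual numpy.dual API:

```python
_registry = {}

def register(name, func):
    """Install a replacement for an overlapping function."""
    _registry[name] = func

def restore(name):
    """Drop a replacement, reverting to the numpy default."""
    _registry.pop(name, None)

def get(name, numpy_default):
    """What numpy's internals would call at use time."""
    return _registry.get(name, numpy_default)
```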
Anyway, sorry if I am wasting your time with this
discussion; I don't feel too strongly about this point
(especially after the version check).
Maybe Fernando would like to add something -
I also have to move on to other stuff
(but whom do I tell that ;-).
Best, Arnd