[Numpy-discussion] NumPy re-factoring project
Fri Jun 11 10:17:29 CDT 2010
On 11 June 2010 11:12, Benjamin Root <firstname.lastname@example.org> wrote:
> On Fri, Jun 11, 2010 at 8:31 AM, Sturla Molden <email@example.com> wrote:
>> It would also make sense to evaluate expressions like "y = b*x + a"
>> without a temporary array for b*x. I know roughly how to do it, but
>> don't have time to look at it before next year. (Yes, I know about
>> numexpr; I am talking about plain Python code.)
> If I may chime in here with my own experience with NumPy code...
> I typically use older, "weaker" computers for my work. I am not doing
> real-time modeling or other really advanced, difficult computations.
> For me, NumPy works "fast enough", even on an EeePC. My main issue is the
> one given above by Sturla. I find that NumPy's memory usage can go
> out of control very easily in long mathematical expressions. With a mix of
> constants and large matrices, each step in the order of operations seems to
> take up more memory. Often, I would run into a major slow-down from
> thrashing the swap. This is fairly trivial to get around by operating over
> slices of the matrices at a time, but -- to me -- all of this talk about
> optimizing the speed of the operations without addressing the temporaries
> issue is like trying to tune up the engine of a car without bothering to
> take the lead weights out of the trunk.
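The slicing workaround Ben describes might look something like the sketch below: evaluate "y = b*x + a" one block of rows at a time, so the temporary for b*x is only block-sized rather than full-sized. The function name, block size, and variable names here are illustrative, not from the original post.

```python
import numpy as np

def blocked_linear(x, b, a, block=1024):
    """Evaluate y = b*x + a block by block to cap the size of temporaries.

    Each loop iteration only allocates a block-sized temporary for
    b * x[start:stop], instead of one temporary as large as x itself.
    """
    y = np.empty_like(x)
    n = x.shape[0]
    for start in range(0, n, block):
        stop = min(start + block, n)
        y[start:stop] = b * x[start:stop] + a
    return y

x = np.arange(10000, dtype=float)
y = blocked_linear(x, 2.0, 1.0, block=512)
assert np.allclose(y, 2.0 * x + 1.0)
```

The trade-off is a Python-level loop over blocks, but for arrays large enough to push the machine into swap, keeping the working set small usually wins.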
I should say, though, that I've gone through the process of removing
all temporary allocation using ufunc output arguments (np.add(a,b,c))
only to discover that it didn't actually save any memory and it was
slower. The nice thing about temporaries is that they're, well,
temporary; they go away.
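For readers unfamiliar with the output-argument style mentioned above, a minimal sketch: each ufunc accepts a destination array, so a chain of operations can reuse one preallocated buffer instead of allocating a fresh temporary per step. The specific arrays here are illustrative.

```python
import numpy as np

a = np.arange(5.0)
b = np.ones(5)
c = np.empty(5)

np.add(a, b, c)         # writes a + b directly into c; no temporary allocated
np.multiply(c, 2.0, c)  # later steps can keep reusing the same buffer

assert np.allclose(c, (a + b) * 2.0)
```

As the post notes, this does not necessarily beat the naive expression: the ordinary temporaries are short-lived and are freed as soon as the expression finishes evaluating.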
On the other hand, since memory reads are very slow, optimizations
that do more calculation per load/store could make a very big
difference, eliminating temporaries as a side effect.
> Just my 2 cents.
> Ben Root