[Numpy-discussion] memory usage
Tue Oct 14 17:11:01 CDT 2008
On Tue, Oct 14, 2008 at 17:02, emil <firstname.lastname@example.org> wrote:
> I'm having a problem with my python code, using numpy, chewing up too
> much memory.
> In the following, I boiled down my program to the simplest example that
> has the problem:
> from numpy import *
> for i in range(1000):
>     a = random.randn(512**2)
>     b = a.argsort(kind='quick')
> This loop takes a couple of minutes to run on my machine.
> While running 'top' concurrently, I see that the memory usage is
> increasing as the loop progresses. By the time the loop is finished
> the python process is taking over 30% of the memory, and I have 8 GB RAM.
> Is there some way to prevent this from happening?
> It's fine if the alternative slows the code down a bit.
> I'm using Python 2.4 and numpy 1.0.1.
Can you try upgrading to numpy 1.2.0? On my machine with numpy 1.2.0
on OS X, the memory usage is stable.
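For reference, here is a sketch of the same loop with explicit imports and the sort kind spelled out as 'quicksort' (the full name numpy documents); the iteration count is reduced from the original 1000 for brevity. Since `a` and `b` are rebound on every pass, the arrays from the previous iteration become unreferenced and should be freed, so memory usage is expected to stay flat on a fixed numpy version:

```python
import numpy as np

# Each pass rebinds a and b, so the arrays from the previous
# iteration lose their last reference and should be freed.
# Iteration count reduced from 1000 for brevity.
for i in range(10):
    a = np.random.randn(512**2)        # 262144 float64 values, ~2 MB
    b = a.argsort(kind='quicksort')    # indices that sort a

# Sanity check on the final pass: indexing a with b yields a sorted array.
assert b.shape == a.shape
assert np.all(np.diff(a[b]) >= 0)
```

If memory still grows across iterations of a loop like this, that points to a leak in the numpy version in use rather than the Python code itself.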
"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
-- Umberto Eco