[Numpy-discussion] A memory problem: why does mmap come up in numpy.inner?

Dan Yamins dyamins@gmail....
Wed Jun 4 19:42:10 CDT 2008


I'm using python 2.5.2 on OS X, with 8 GB of RAM and a 64-bit processor.
In this setting, I'm working with large arrays of binary data.  E.g., I want to
make calls like:
               Z = numpy.inner(a, b)
where a and b are fairly large -- e.g. 20000 rows by 100 columns.

However, when such a call is made, I get a memory error that I don't
understand.  Specifically:

>>> s = numpy.random.binomial(1, .5, (20000, 100))   # creates a 20000x100 binary array
>>> r = numpy.inner(s, s)
Python(1714) malloc: *** mmap(size=1600000000) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug

Naively, the numpy.inner call should be fine on my system, since my computer
has enough memory.  (And when it runs, I have checked that at least 5 GB of
RAM is free.)  The error message thus suggests there's some problem to do with
memory mapping going on here: that somehow numpy.inner is calling the mmap
module, that the address space is being exceeded, and that all my extra RAM
isn't being used here at all.
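For what it's worth, the 1600000000 in the error message matches the size of the result array: numpy.inner on two (20000, 100) arrays produces a (20000, 20000) output, so at 4 bytes per element that is exactly 1.6 GB.  A quick back-of-the-envelope check (the 4-byte itemsize is my assumption; with 8-byte integers it would be 3.2 GB):

```python
rows = 20000
itemsize = 4  # assumed element size; 8-byte ints would double this
# inner(s, s) on a (20000, 100) array yields a (20000, 20000) result,
# so the output alone needs rows * rows * itemsize bytes.
result_bytes = rows * rows * itemsize
print(result_bytes)  # 1600000000, matching the failed mmap request
```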

So, I have three questions about this:
    1) Why is mmap being called in the first place?  I've written to Travis
Oliphant, and he's explained that numpy.inner does NOT directly do any memory
mapping and shouldn't call mmap.  Instead, it should just operate with things
in memory -- in which case my 8 GB should allow the computation to go through
just fine.  What's going on?

    2) How can I stop this from happening?  I want to be able to leverage
large amounts of RAM on my machine to scale up my computations, and not be
dependent on the limitations of the address space size.  If mmap is somehow
being called by the OS, is there some option I can set that will make it do
things in regular memory instead?  (Sorry if this is a stupid question.)
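One workaround I've been considering (just a sketch, assuming the problem is the size of the allocation rather than numpy itself): for 2-D arrays, numpy.inner(a, b) computes the same thing as numpy.dot(a, b.T), and casting the inputs to float32 first would halve the memory needed for the result:

```python
import numpy

# float32 inputs make the (20000, 20000) result half the size of a
# float64 one; small shapes used here just to illustrate.
a = numpy.random.binomial(1, .5, (200, 100)).astype(numpy.float32)

# For 2-D arrays, inner(a, b) sums over the last axis of each operand,
# which is exactly dot(a, b.T).
z = numpy.dot(a, a.T)  # (200, 200) float32 result
```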

    3) Even if I had to use memory mapping, why is the 1.6 GB request
failing?  I'm using a recent enough version of python, and I have a 64-bit
processor with a sufficient amount of memory.  I should be able to allocate at
least 4 GB of address space, right?  But the system seems to be balking at the
1.6 GB request.  (Again, sorry if this is stupid.)
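In case it helps with question 3, here's the check I've been using to see whether my Python build is actually 64-bit (a 32-bit interpreter caps the address space near 4 GB no matter what the processor supports) -- it just reports the pointer size of the running interpreter:

```python
import struct

# Pointer size of the running interpreter: 4 bytes on a 32-bit build,
# 8 bytes on a 64-bit build, regardless of the hardware underneath.
bits = struct.calcsize("P") * 8
print("%d-bit Python" % bits)
```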

Any help would be greatly appreciated!  Thanks,
Dan

