[Numpy-discussion] A memory problem: why does mmap come up in numpy.inner?
Charles R Harris
Wed Jun 4 20:06:12 CDT 2008
On Wed, Jun 4, 2008 at 6:42 PM, Dan Yamins <firstname.lastname@example.org> wrote:
> I'm using python 2.5.2 on OS X, with 8 GB of ram, and a 64-bit processor.
> In this setting, I'm working with large arrays of binary data. E.g., I want to
> make calls like:
> Z = numpy.inner(a,b)
> where a and b are fairly large -- e.g. 20000 rows by 100 columns.
> However, when such a call is made, I get a memory error that I don't
> understand:
> >>> s = numpy.random.binomial(1,.5,(20000,100)) #creates 20000x100 binary array
> >>> r = numpy.inner(s,s)
> Python(1714) malloc: *** mmap(size=1600000000) failed (error code=12)
> *** error: can't allocate region
> *** set a breakpoint in malloc_error_break to debug
> Naively, the numpy.inner call should be fine on my system, since my
> computer has enough memory. (And, when it's run, I have checked to see
> that at least 5 GB of RAM is free.) The error message thus suggests
> there's some problem to do with memory mapping going on here: that
> somehow, numpy.inner is calling the mmap module, and that the address
> space is being exceeded. And that all my extra RAM isn't being used
> here at all.
> So, I have three questions about this:
> 1) Why is mmap being called in the first place? I've written to Travis
> Oliphant, and he's explained that numpy.inner does NOT directly do any
> mapping and shouldn't call mmap. Instead, it should just operate with
> things in
> memory -- in which case my 8 GB should allow the computation to go through
> fine. What's going on?
> 2) How can I stop this from happening? I want to be able to leverage the
> large amounts of RAM on my machine to scale up my computations and not be
> dependent on the limitations of the address space size. If mmap is somehow
> being invoked by the OS, is there some option I can set that will make it
> do things in memory instead? (Sorry if this is a stupid question.)
> 3) Even if I had to use memory mapping, why is the 1.6 GB requirement
> failing? I'm using a recent enough version of python, and I have a 64-bit
> processor with a sufficient amount of memory. I should be able to allocate
> at least 4 GB of address space, right? But the system seems to be balking
> at a 1.6 GB request. (Again, sorry if this is stupid.)
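The mmap size in the error message is consistent with the result array that numpy.inner must allocate, assuming 4-byte integer elements (as on a 32-bit build). A minimal sketch of the arithmetic, without allocating the full array:

```python
import numpy as np

# inner(a, b) on two (rows, cols) arrays produces a (rows, rows) result.
rows, cols = 20000, 100
itemsize = 4  # bytes per element if the default integer is 32-bit

result_bytes = rows * rows * itemsize
print(result_bytes)  # 1600000000 -- matches mmap(size=1600000000) above

# Small-scale check of the output shape:
s = np.random.binomial(1, 0.5, (50, 10))
r = np.inner(s, s)
print(r.shape)  # (50, 50)
```

On a fully 64-bit build the default integer is 8 bytes, so the same result would need 3.2 GB; either way, the allocation is for the output matrix, not the inputs.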
Are both python and your version of OS X fully 64 bits?
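The bitness of the running interpreter can be checked from within Python itself; a minimal sketch:

```python
import struct
import sys

# The size of a C pointer, in bits, tells you whether this Python
# build is 32- or 64-bit.
print(struct.calcsize("P") * 8)  # 64 on a 64-bit Python, 32 on a 32-bit one
print(sys.maxsize > 2**32)       # True only on a 64-bit build
```

A 32-bit Python on a 64-bit OS would explain the failure: its address space tops out well below the 8 GB of physical RAM.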