[SciPy-User] Strange memory limits

Chris Weisiger cweisiger@msg.ucsf....
Tue Mar 29 13:12:39 CDT 2011

On Tue, Mar 29, 2011 at 10:39 AM, Christoph Gohlke <cgohlke@uci.edu> wrote:

> Try VMMap
> <http://technet.microsoft.com/en-us/sysinternals/dd535533.aspx>. The
> software lists, among other useful information, the sizes of contiguous
> blocks of memory available to a process. You'll probably find that 64
> bit Python lets you use a much larger contiguous block than 32 bit Python.
> It could help to create large numpy arrays early in the program, e.g.
> before importing packages or creating other arrays.
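The suggestion above can be sketched as follows. This is a minimal illustration, not code from the thread; the buffer count and shape are arbitrary examples:

```python
# Sketch: reserve the large arrays first, before imports and other
# allocations fragment the 32-bit address space. Sizes here are
# made-up examples, not from the original post.
import numpy as np

def reserve_buffers(n_buffers, shape, dtype=np.float64):
    """Allocate the big arrays up front so they claim contiguous blocks."""
    return [np.empty(shape, dtype=dtype) for _ in range(n_buffers)]

# Each (1024, 1024, 16) float64 array is exactly 128 MB of contiguous memory.
buffers = reserve_buffers(2, (1024, 1024, 16))
```

The point is ordering: each `np.empty` call needs one contiguous region of address space, so claiming those regions before module imports and small allocations scatter "image" mappings across the heap improves the odds that they succeed.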
Ah, thanks. Looks like there are some very loosely packed "image" allocations
at one end of the heap that are basically precluding allocations of large
arrays in that area without actually using up all that much total memory. I
wonder if maybe they're for imported Python modules...well, at least now I
have a tool to help me figure out where the memory is going. The right answer
is probably still to just make a 64-bit version, though.
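For readers without VMMap handy, a rough, cross-platform way to estimate the largest single contiguous numpy allocation the process can currently make is to binary-search over allocation sizes. This helper is hypothetical (not from the thread), and VMMap on Windows gives the exact address-space map rather than an estimate:

```python
# Rough estimate of the largest contiguous allocation (in MB) that
# currently succeeds in this process. Hypothetical diagnostic helper,
# not from the original thread.
import numpy as np

def largest_contiguous_mb(limit_mb=4096):
    """Binary-search the largest single np.empty allocation that succeeds."""
    lo, hi = 0, limit_mb
    while lo < hi:
        mid = (lo + hi + 1) // 2
        try:
            a = np.empty(mid * 1024 * 1024, dtype=np.uint8)
            del a  # release immediately; we only probe, not keep
            lo = mid
        except MemoryError:
            hi = mid - 1
    return lo
```

On a fragmented 32-bit process this number can be far below the total free memory, which is exactly the symptom described above.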

