[SciPy-User] Strange memory limits
Charles R Harris
Mon Mar 28 22:01:05 CDT 2011
On Mon, Mar 28, 2011 at 4:22 PM, Chris Weisiger <email@example.com> wrote:
> (This is unrelated to my earlier question about 2D data slicing)
> We have a 32-bit Windows program that has Python bindings which do most of
> the program logic, reserving the C++ side for heavy lifting. This program
> needs to reserve buffers of memory to accept incoming image data from our
> different cameras -- it waits until it has received an image from all active
> cameras, then saves the images to disk, and repeats until all images are in. So
> the Python side uses numpy to allocate a block of memory, then hands it off
> to the C++ side where images are written to it and then later stored.
> Ordinarily all of our cameras are operating in sync so the delay between the
> first and last cameras is small, so we can keep the memory buffer small. I'm
> working on a modified data collection mode where each camera does a lengthy
> independent sequence, though, requiring me to either rewrite the data saving
> system or simply increase the buffer size.
> Increasing the buffer size works just fine until I try to allocate about a
> 3x735x512x512 array (camera/Z/X/Y) of 16-bit ints, at which point I get a
> MemoryError. This is only a bit over 1GB worth of memory (out of 12GB on the
> computer), and according to Windows' Task Manager the program was only using
> about 100MB before I tried the allocation -- of course, I've no idea how the
> Task Manager maps to how much RAM I've actually requested. So that's a bit
> strange. I ought to have 4GB worth of space (or at the very least 3GB),
> which is more than enough for what I need.
32-bit Windows gives a process 2GB of user address space and keeps the rest for itself.
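A quick sketch of the arithmetic behind the failure (the shape and dtype are taken from the question above; the 2GB figure is the default user address space for a 32-bit Windows process). The requested array needs a bit over 1GB as a single contiguous block, which a fragmented 2GB address space often cannot supply even when plenty of physical RAM is free:

```python
import numpy as np

# Buffer requested in the question: 3 cameras x 735 Z-slices x 512 x 512
# pixels of 16-bit integers.
shape = (3, 735, 512, 512)
nbytes = int(np.prod(shape)) * np.dtype(np.uint16).itemsize
print(f"{nbytes} bytes = {nbytes / 1024**3:.2f} GiB")  # ~1.08 GiB

# numpy allocates this as one contiguous block, so the process needs a
# single free hole of that size in its (default 2 GB) address space.
# On 32-bit Windows that frequently fails with MemoryError even though
# total free memory is ample -- the address space is simply fragmented.
```

One common workaround is to allocate a separate, smaller buffer per camera (three ~370MB blocks are far easier to place than one 1.1GB block), or to build the program as 64-bit.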