[Numpy-discussion] Not enough storage for memmap on 32 bit Win XP for accumulated file size above approx. 1 GB
Charles R Harris
Thu Jul 23 09:28:28 CDT 2009
On Thu, Jul 23, 2009 at 7:48 AM, Kim Hansen <firstname.lastname@example.org> wrote:
> 2009/7/23 Charles R Harris <email@example.com>:
> >> Is it due to the 32 bit OS I am using?
> > It could be. IIRC, 32 bit windows gives user programs 2 GB of addressable
> > memory, so your files need to fit in that space even if the data is on
> > disk. You aren't using that much memory, but you are close, and it could be
> > that other programs make up the difference. Maybe you can monitor the
> > memory usage to get a better idea of what is going on.
> > Chuck
> Hi Chuck,
> If I use the Windows task manager to see how much memory is used by
> the Python application while running the memmap test, it reports:
> Before loading first memmap: 8.588 MB
> After loading first memmap: 8.596 MB
> i.e., only an additional 8 kB for having the 750 MB recarray available.
> Maybe I am measuring memory usage the wrong way?
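The test described above can be sketched roughly as follows (the file name, sizes, and dtype here are hypothetical and scaled down so the sketch runs anywhere; the original test used a ~750 MB recarray):

```python
import os
import tempfile

import numpy as np

# Hypothetical stand-in for the original data file.
path = os.path.join(tempfile.mkdtemp(), "data.bin")

# Write a modest file to disk (scaled down from the original ~750 MB).
n = 1_000_000
np.arange(n, dtype=np.int32).tofile(path)

# Opening the memmap only reserves address space; no data is read yet.
mm = np.memmap(path, dtype=np.int32, mode="r", shape=(n,))

# Accessing elements is what actually pages data in from disk, which is
# why the task manager shows almost no change right after opening.
print(mm[0], mm[-1])
```

This is consistent with the tiny 8 kB jump observed: creating the memmap maps the file into the address space without touching physical memory.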
Hmm, I don't know what you should be looking at in XP. Memmapped files are
sort of like virtual memory: they exist in the address space even if they
aren't in physical memory. When you address an element that isn't in
physical memory there is a page fault and the OS reads the needed page in
from disk. If you read through the file, physical memory will probably fill
up, because the OS will try to keep as many pages in physical memory as
possible in case they are referenced again. But I am not sure how Windows
does its memory accounting or how it is displayed; someone here more
familiar with Windows may be able to tell you what to look for. Or you
could try running on a 64 bit system if one is available.