[Numpy-discussion] Not enough storage for memmap on 32 bit WinXP for accumulated file size above approx. 1 GB
Fri Jul 24 05:55:36 CDT 2009
>> I tried adding the /3GB switch to boot.ini as you suggested:
>> multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP
>> Professional" /noexecute=optin /fastdetect /3GB
>> and rebooted the system.
>> Unfortunately that did not change anything for me. I still hit a hard
>> deck around 1.9 GB. Strange.
> The 3Gb thing only works for applications specifically compiled for it:
> I somewhat doubt python is built with this, but you could check the
> python sources to be sure,
Ahh, that explains it. Thank you for that enlightening link. Anyway,
wouldn't it be worth mentioning this 32 bit limitation in the memmap
documentation, or is it so straightforwardly obvious (it was not for
me) that this is the case?
The reason it isn't obvious to me is that I can read and manipulate
files >200 GB in Python with no problems (yes, I really do process
files that large), so I assumed it should be able to handle equally
large memmaps as well...
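The difference between the two cases is that chunked file I/O only ever holds one block in memory, whereas a memmap must fit into the process's virtual address space all at once; on 32 bit Windows that space tops out around 2 GB (roughly 3 GB with the /3GB switch plus a LARGEADDRESSAWARE executable), so memmaps hit a wall long before ordinary file reading does. A minimal sketch of the contrast, using a small made-up demo file (the file name and sizes are illustrative, not from the original thread):

```python
import os
import tempfile
import numpy as np

# Create a small demo file (~8 MB). On a real workload this could be
# hundreds of GB; only the memmap case cares about total size.
path = os.path.join(tempfile.mkdtemp(), "demo.dat")
np.arange(1_000_000, dtype=np.float64).tofile(path)

# Chunked read: address-space use is bounded by the chunk size,
# so this scales to files far larger than the address space.
total = 0.0
with open(path, "rb") as f:
    while True:
        chunk = np.fromfile(f, dtype=np.float64, count=100_000)
        if chunk.size == 0:
            break
        total += chunk.sum()

# memmap: the entire file is mapped into virtual memory at once.
# On a 32 bit process this fails with an OSError/"Not enough storage"
# once the accumulated mapped size exceeds the ~1-2 GB limit.
mm = np.memmap(path, dtype=np.float64, mode="r")
print(mm.sum() == total)
```

This is why the >200 GB streamed workloads succeed while memmaps fail around 2 GB: the limit is address space, not disk or file-handling capability.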