[SciPy-user] 64 bit Address Space Limitations

Andrew Straw strawman at astraw.com
Tue Mar 14 14:24:00 CST 2006


Travis Oliphant wrote:

>Mark W. wrote:
>>Hi. We are converting our systems to a 64-bit platform, hopefully to
>>take advantage of larger address spaces for arrays and such. Can anyone
>>tell me, or point me to documentation that tells, how much address
>>space I could hope to get for an array? We get a memory error on our
>>32-bit machines when we try to load a large array, and we're hoping
>>this will get around that 2 GB (or less) limit.
>This is finally possible using Python 2.5 and NumPy. But Python 2.5 is
>currently only available as an SVN checkout and still has a few issues;
>it should be available as a release in the summer.
>
>NumPy allows creation of larger arrays even with Python 2.4, but some
>uses of slicing, the buffer interface, and memory-mapped arrays will
>produce errors because of inherent limitations in Python that were only
>recently removed.
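To make that concrete, here is a rough sketch (the sizes are made up;
scale them to the RAM you actually have) of the kind of single
allocation that needs the fixes Travis describes: one array whose total
size exceeds 2 GB.

    import numpy

    # A single 3 GiB array of uint8, past the 32-bit 2 GB ceiling.
    # On a 32-bit build this should fail (MemoryError or overflow);
    # on a 64-bit build with a new-enough Python/NumPy it should work.
    n = 3 * 1024**3
    a = numpy.zeros(n, dtype=numpy.uint8)
    print a.nbytes / float(1024**3), "GiB allocated"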
While what Travis says is correct, note that even with older Pythons 
(such as 2.3) a process can use more than 2 GB of memory, as long as no 
individual array runs into the limits Travis mentions. This was a major 
reason for me to move to a 64-bit machine (loading a few 1 GB arrays). 
My amd64 box is working quite well in full 64-bit mode with Debian 
sarge's default python2.3 and the latest numpy, scipy, etc., and I can 
easily have individual processes using more than 2 GB, as sketched 
below. Also, I've found this page helpful for information about memory 
limits under Linux: 
http://www.spack.org/wiki/LinuxRamLimits
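
Concretely, the pattern I mean is roughly this (a sketch with made-up
sizes; each array stays comfortably under any per-array limit, but the
process as a whole goes well past 2 GB, which only fits in a 64-bit
address space):

    import numpy

    gig = 1024**3
    # Three separate 1 GiB arrays: no single object is unusually
    # large, but the process total comes to about 3 GiB.
    arrays = [numpy.zeros(gig, dtype=numpy.uint8) for i in range(3)]
    total = sum([a.nbytes for a in arrays])
    print "total allocated: %.1f GiB" % (total / float(gig))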

Cheers!
Andrew


