[Numpy-discussion] large memory address space on Mac OS X (intel)

Robert Kern robert.kern@gmail....
Thu Feb 1 13:48:08 CST 2007


Travis Oliphant wrote:
> Louis Wicker wrote:
> 
>> Dear list:
>>
>> I cannot seem to figure out how to create arrays > 2 GB on a Mac Pro 
>> (using an Intel chip and Tiger, 10.4.8).  I have hand-compiled both Python 
>> 2.5 and numpy 1.0.1, and cannot make arrays bigger than 2 GB.  I also 
>> run out of space if I try to create 3-6 arrays of 1000 MB or so (the 
>> mem-alloc failure does not seem consistent; it depends on whether I am 
>> creating them with a "numpy.ones()" call or creating them on the fly 
>> by doing math with the other arrays, e.g., "c = 4.3*a + 3.1*b").
>>
>> Is this a numpy issue, or a Python 2.5 issue for the Mac?  I have 
>> tried this on the SGI Altix, and this works fine.
> 
> It must be a malloc issue.  NumPy uses the system malloc to construct 
> arrays; it simply reports an error back to you if the allocation fails.
> 
> I don't think the Mac Pro uses a 64-bit chip, does it?

Intel Core 2 Duos are 64-bit, but the OS and runtime libraries are not. None of
the Python distributions for OS X are compiled for 64-bit support, either.

When Leopard comes out, there will be better 64-bit support in the OS, and
Python will follow. Until then, Python on OS X is 32-bit.

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth."
  -- Umberto Eco
