[Numpy-discussion] Questions about cross-compiling extensions for mac-ppc and mac-intel

Robert Kern robert.kern@gmail....
Mon Feb 26 13:43:09 CST 2007

Zachary Pincus wrote:

> To address the former, I'd like to be able to (say) include something  
> like 'config_endian --big' on the 'python setup.py' command-line, and  
> have that information trickle down to the PIL config script (a few  
> subpackages deep). Is this easy or possible?

I'd just do separate builds on PPC and Intel machines.
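
If the config step only needs to know the byte order of the machine doing the
build, a native build on each machine can detect it instead of taking it on the
command line. A minimal sketch (the WORDS_BIGENDIAN macro name is only an
illustration, not something PIL is known to define):

  import sys

  # Building natively on each architecture, the interpreter's own byte
  # order is the right answer; no --big/--little switch is needed.
  if sys.byteorder == 'big':        # PPC
      define_macros = [('WORDS_BIGENDIAN', 1)]
  else:                             # Intel
      define_macros = []

That only holds because each binary is built on its own architecture; a single
fat build would need a different answer per slice, which is another reason to
keep the builds separate.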

> To address the latter, I think I need to have the PIL extensions
> dynamically link against '/Developer/SDKs/MacOSX10.4u.sdk/usr/lib/libz.dylib',
> which is the fat-binary version of the library, using the headers from
> '/Developer/SDKs/MacOSX10.4u.sdk/usr/include/zlib.h'. Right now, PIL is using
> system_info from numpy.distutils to find the library paths on which libz and
> its headers might live. This is nice and more or less platform-neutral, which
> I like. How best should I convince/configure numpy.distutils.system_info to
> put '/Developer/SDKs/MacOSX10.4u.sdk/usr/{lib,include}' in the output of
> get_include_dirs() and get_lib_dirs()?

distutils ought to be including an argument like this:

  -isysroot /Developer/SDKs/MacOSX10.4u.sdk

That ought to be sufficient for building PIL. Avoid hacking up PIL's build
process if at all possible.
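
You can check whether the Python you are building with already carries that
flag; distutils reuses the interpreter's own build flags when compiling
extensions, and the universal framework builds ought to show the sysroot there.
A quick check (just a sketch, not part of any build script):

  from distutils import sysconfig

  # Print the compile/link flags distutils will reuse for extensions;
  # look for -isysroot /Developer/SDKs/MacOSX10.4u.sdk in the output.
  for var in ('CFLAGS', 'LDFLAGS', 'LDSHARED'):
      print('%s = %s' % (var, sysconfig.get_config_var(var)))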

Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth."
  -- Umberto Eco
