[Numpy-discussion] Out-of-RAM FFTs
Charles R Harris
Wed Apr 1 12:06:01 CDT 2009
On Wed, Apr 1, 2009 at 9:26 AM, David Cournapaau wrote:
> Greg Novak wrote:
> > 1) Numerical Recipes has an out-of-memory FFT algorithm, but looking
> > through the numpy and scipy docs and modules, I didn't find a function
> > that does the same thing. Did I miss it?
> I don't think so.
> > Should I get to work typing
> > it in?
> Maybe :)
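For reference, the out-of-memory FFT in Numerical Recipes is based on splitting one big length-N transform into many small ones so that only a slice of the data needs to be resident at a time. A minimal in-memory sketch of that decomposition (the Cooley-Tukey "four-step" algorithm, with N = n1 * n2) might look like this; the function name and interface are my own, not anything in numpy or scipy:

```python
import numpy as np

def four_step_fft(x, n1):
    """Compute the DFT of x via the four-step (Cooley-Tukey) decomposition.

    Splits a length-N transform (N = n1 * n2) into n2 column FFTs of
    length n1 plus n1 row FFTs of length n2, which is the structure an
    out-of-core implementation exploits: each pass only needs one
    row/column block in RAM at a time.
    """
    n = len(x)
    assert n % n1 == 0, "n1 must divide len(x)"
    n2 = n // n1
    # View the 1-D signal as an n1 x n2 matrix (row-major).
    a = np.asarray(x, dtype=complex).reshape(n1, n2)
    # Step 1: length-n1 FFTs down the columns.
    b = np.fft.fft(a, axis=0)
    # Step 2: multiply by the twiddle factors exp(-2*pi*i * k1*j2 / N).
    k1 = np.arange(n1).reshape(n1, 1)
    j2 = np.arange(n2).reshape(1, n2)
    b *= np.exp(-2j * np.pi * k1 * j2 / n)
    # Step 3: length-n2 FFTs along the rows.
    d = np.fft.fft(b, axis=1)
    # Step 4: transpose so that output index k = k2*n1 + k1 comes out
    # in natural order.
    return d.T.ravel()
```

An out-of-core version would replace the reshaped in-memory array with blocks read from disk, and the transpose in step 4 becomes the expensive part (it is essentially an external-memory matrix transpose).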
> > 2) I had high hopes for just memory-mapping the large array and
> > passing it to the standard fft function. However, the memory-mapped
> > region must fit into the address space, and I don't seem to be able to
> > use more than 2 GB at a time. So memory mapping doesn't seem to help
> > me at all.
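For anyone trying the memory-mapping route, here is a small sketch of what passing a memmapped array to the standard FFT looks like (sizes kept tiny for illustration; on a 32-bit build the same code fails once the mapped file no longer fits in the 2 GB address space):

```python
import os
import tempfile
import numpy as np

# Create a disk-backed array; np.memmap maps the file into the
# process's address space, which is exactly where the 32-bit limit bites.
fname = os.path.join(tempfile.mkdtemp(), "big.dat")
n = 1_000_000  # a real out-of-RAM array would be far larger
m = np.memmap(fname, dtype=np.complex128, mode="w+", shape=(n,))

m[:] = 0.0
m[0] = 1.0          # a delta function, whose spectrum is flat
m.flush()

# The standard FFT accepts the memmap like any ndarray, but it still
# allocates the full-size output in RAM -- it is not out-of-core.
spec = np.fft.fft(m)
```

Note that even when the mapping succeeds, `np.fft.fft` materializes the whole result in memory, so memmapping the input alone does not make the transform out-of-core.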
> > This last issue leads to another series of things that puzzle me. I
> > have an iMac running OS X 10.5 with an Intel Core 2 duo processor and
> > 4 GB of memory. As far as I've learned, the processor is 64 bit, the
> > operating system is 64 bit, so I should be able to happily memory-map
> > my entire disk if I want. However, Python seems to run out of steam
> > when it's used 2 GB. This is true of both 2.5 and 2.6. What gives?
> > Is this a Python issue?
> Yes - the official python binaries are 32-bit only. I don't know how
> advanced/usable the 64-bit build is, but I am afraid you will have to
> use an unofficial build or build it yourself.
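A quick way to check which kind of build you are actually running (this works on both 2.5 and 2.6; `sys.maxsize` only exists from 2.6 on, so the `struct` check is the portable one):

```python
import struct

# Size of a C pointer in the running interpreter, in bits:
# 32 on the official OS X binaries of that era, 64 on a 64-bit build.
bits = struct.calcsize("P") * 8
print(bits)
```

If this prints 32, the 2 GB ceiling is coming from the interpreter build, not from the OS or the hardware.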
> I don't know if the following can help you:
There was a thread about this a while back; note Michael Abshoff's directions on building a 64-bit Python on the Mac.