[Numpy-discussion] Out-of-RAM FFTs
Wed Apr 1 12:19:52 CDT 2009
In any case, the OS will have to swap a lot of your data:
- if you use single-precision floats (32 bits), your 1024^3 input array alone takes 4 GB
- this does not fit in your 4 GB of memory
- it fits even less once you account for the fact that the FFT needs at
least one scratch array of the same size, so at least 8 GB in total.
So you should, in every case, split your data and load it on the fly,
by hand (after each FFT, you could swap the array axes, but that may
not be the best approach, the best being having something like 16 GB
of RAM for the whole problem).
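That splitting strategy can be sketched as follows. This is only a
minimal sketch, not tested at full scale: a 3D FFT is separable, so you
can do 1D FFTs along each axis in turn, loading a few planes at a time
from a memory-mapped file. The file name, slab size, and the small demo
dimensions below are all placeholders; for the real problem N would be
1024 and the dtype would keep the array on disk rather than in RAM.

```python
import os
import tempfile

import numpy as np

N = 32      # stand-in size; the real problem would use N = 1024
SLAB = 8    # number of planes held in RAM at once (hypothetical choice)

# Create the cube on disk as a memory-mapped array (complex64 so the
# transform can be written back in place, single precision as above).
fd, path = tempfile.mkstemp(suffix='.dat')
os.close(fd)
data = np.memmap(path, dtype=np.complex64, mode='w+', shape=(N, N, N))
rng = np.random.default_rng(0)
data[:] = rng.standard_normal((N, N, N)).astype(np.complex64)
orig = np.array(data)   # in-RAM copy, kept only to check the result

def fft_along(mm, axis, slab=SLAB):
    """1D FFT along `axis`, loading only `slab` planes at a time.

    Slabs are cut along some *other* axis, so every chunk contains
    full lines along `axis`.
    """
    slab_axis = 1 if axis == 0 else 0
    for start in range(0, mm.shape[slab_axis], slab):
        idx = [slice(None)] * mm.ndim
        idx[slab_axis] = slice(start, start + slab)
        idx = tuple(idx)
        chunk = np.asarray(mm[idx])              # read slab into RAM
        mm[idx] = np.fft.fft(chunk, axis=axis)   # transform, write back
    mm.flush()

# A 3D FFT is separable: apply 1D FFTs along each axis in turn.
for ax in range(3):
    fft_along(data, ax)
```

The same loop works unchanged on a 1024^3 memmap: only SLAB * N^2
elements sit in RAM at any moment, at the cost of reading and writing
the whole file once per axis.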
2009/4/1 Greg Novak <email@example.com>:
> I'd like to do an FFT of a moderately large 3D cube, 1024^3. Looking
> at the run-time of smaller arrays, this is not a problem in terms of
> compute time, but the array doesn't fit in memory. So, several questions:
> 1) Numerical Recipes has an out-of-memory FFT algorithm, but looking
> through the numpy and scipy docs and modules, I didn't find a function
> that does the same thing. Did I miss it? Should I get to work typing
> it in?
> 2) I had high hopes for just memory-mapping the large array and
> passing it to the standard fft function. However, the memory-mapped
> region must fit into the address space, and I don't seem to be able to
> use more than 2 GB at a time. So memory mapping doesn't seem to help
> me at all.
> This last issue leads to another series of things that puzzle me. I
> have an iMac running OS X 10.5 with an Intel Core 2 duo processor and
> 4 GB of memory. As far as I've learned, the processor is 64 bit, the
> operating system is 64 bit, so I should be able to happily memory-map
> my entire disk if I want. However, Python seems to run out of steam
> when it's used 2 GB. This is true of both 2.5 and 2.6. What gives?
> Is this a Python issue?
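The 2 GB ceiling is the classic symptom of a 32-bit process: my guess
(not confirmed in this thread) is that the Python build itself is
32-bit, which was common on OS X 10.5 even on 64-bit hardware, so the
address space is capped regardless of the CPU and OS. A quick way to
check which kind of interpreter is running:

```python
import struct
import sys

# Size of a C pointer in this interpreter build: 4 bytes on a 32-bit
# build (roughly 2 GB of usable address space), 8 bytes on 64-bit.
bits = 8 * struct.calcsize("P")
print("This Python build is %d-bit" % bits)
print("sys.maxsize = %d" % sys.maxsize)  # ~2**31 on 32-bit, ~2**63 on 64-bit
```

If this reports 32-bit, no amount of memory mapping will get past the
2 GB limit; a 64-bit interpreter build would be needed first.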
Information System Engineer, Ph.D.
Blogs: http://matt.eifelle.com and http://blog.developpez.com/?blog=92