[Numpy-discussion] Loading a > GB file into array
Thu Dec 20 17:11:07 CST 2007
>> By the way, I installed 64-bit linux (ubuntu 7.10) on the same machine,
>> and now numpy.memmap works like a charm. Slicing around a 15 GB file is fun!
> Thanks for the feedback !
> Did you get the kind of speed you need and/or the speed you were hoping for ?
Nope. As I wrote earlier, my main loop can't afford disk access, and
disk access is what memmap is all about. I resolved this by loading the
whole file into memory as a Python list of 2D arrays instead of one
huge contiguous 3D array. That got me an extra 100 to 200 MB of
physical memory to work with (about 1.4 GB out of 2 GB total) on
win32, which was all I needed.
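For anyone curious, the trick above can be sketched roughly as follows. The file layout, dimensions, and dtype here are hypothetical (the post doesn't give them), and an in-memory buffer stands in for the big raw file on disk:

```python
import io
import numpy as np

# Hypothetical dimensions and dtype -- adjust to the actual file layout.
FRAMES, ROWS, COLS = 8, 16, 16
DTYPE = np.dtype('<f4')  # little-endian float32
frame_bytes = ROWS * COLS * DTYPE.itemsize

# Stand-in for the big raw file (on disk this would be open(path, 'rb')).
raw = io.BytesIO(np.arange(FRAMES * ROWS * COLS, dtype=DTYPE).tobytes())

# Read one frame at a time into its own 2D array.  Python keeps a list
# of independent frame-sized buffers, so the allocator only has to find
# frame-sized blocks, never one huge contiguous region for the whole
# 3D volume -- which is what helps on a fragmented 32-bit address space.
frames = []
for _ in range(FRAMES):
    buf = raw.read(frame_bytes)
    # .copy() gives a writable array owning its own memory
    # (np.frombuffer on a bytes object is read-only otherwise).
    frames.append(np.frombuffer(buf, dtype=DTYPE).reshape(ROWS, COLS).copy())

print(len(frames), frames[0].shape)  # → 8 (16, 16)
```

Indexing stays almost the same: `frames[i][r, c]` instead of `volume[i, r, c]`, at the cost of losing whole-volume vectorized operations across the first axis.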