[Numpy-discussion] Loading a > GB file into array
Fri Nov 30 13:51:23 CST 2007
> Well, one thing you could do is dump your data into a PyTables_
> ``CArray`` dataset, which you can afterwards access as if it were a
> NumPy array: slicing it gives you back actual NumPy arrays. PyTables
> datasets have no problem working with data that exceeds memory size.
> For instance::
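A minimal sketch of the workflow described above (not the original quoted example, which was elided). It assumes PyTables and NumPy are installed; the file name ``data.h5``, the node name ``data``, and the array shape are illustrative:

```python
import numpy as np
import tables

# Create an on-disk CArray and fill it chunk by chunk, so the
# full array never has to fit in memory at once.
with tables.open_file("data.h5", mode="w") as h5:
    carr = h5.create_carray(h5.root, "data",
                            tables.Float64Atom(), shape=(1000, 1000))
    carr[:100] = np.random.rand(100, 1000)  # write one chunk

# Reopen and slice: slicing a CArray returns a real NumPy array,
# loading only the requested rows from disk.
with tables.open_file("data.h5", mode="r") as h5:
    chunk = h5.root.data[10:20]

print(type(chunk), chunk.shape)
```

Only the slice you request is read into memory; the rest of the array stays on disk.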
I've recently started using PyTables for storing large datasets and I'd
give it 10/10! Access is fast enough that you can read just the slices you
need and leave the full array on disk.