[Numpy-discussion] Memory usage of numpy-arrays

Hannes Bretschneider hannes.bretschneider@wiwi.hu-berlin...
Thu Jul 8 08:26:03 CDT 2010

Dear NumPy developers,

I have to process some large files of high-frequency
financial data. I am trying to load a delimited text file of
~700 MB with ~10 million lines using numpy.genfromtxt(). The
machine is a 32-bit Debian Lenny server with 3 GB of memory. Since
the file is only 700 MB, I naively assumed it would fit into
memory in its entirety. However, when I attempt to load it, Python
fills all available memory and then fails with

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.6/site-packages/numpy/lib/io.py", line 1318, in genfromtxt
    errmsg = "\n".join(errmsg)

Is there a way to load this file without crashing?
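(For context, a common workaround: genfromtxt() accumulates intermediate Python lists while parsing, so peak memory can be several times the size of the final array, which easily exhausts a 32-bit address space. A minimal sketch of chunked loading, assuming a purely numeric delimited file; `load_in_chunks` and its parameters are hypothetical names, not NumPy API.)

```python
# Hypothetical sketch: parse the file in fixed-size blocks of lines with
# an explicit dtype, so only one block's worth of Python-level parsing
# overhead is alive at a time.
import io
import itertools
import numpy as np

def load_in_chunks(path, dtype=np.float64, delimiter=",", chunk_lines=100_000):
    """Parse a delimited numeric text file into one array, chunk by chunk."""
    chunks = []
    with open(path) as f:
        while True:
            block = list(itertools.islice(f, chunk_lines))
            if not block:
                break
            # ndmin=2 keeps single-line chunks two-dimensional so that
            # np.concatenate stacks all chunks consistently.
            chunks.append(np.loadtxt(io.StringIO("".join(block)),
                                     dtype=dtype, delimiter=delimiter,
                                     ndmin=2))
    return np.concatenate(chunks)
```

Choosing a float or integer dtype up front (rather than letting the parser guess) also avoids object arrays, which are far larger per element.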

Thanks, Hannes
