[Numpy-discussion] Efficient way to load a 1Gb file?
Sun Aug 14 10:31:24 CDT 2011
Try the fromiter function, which lets you pass an iterator that reads
the file line by line instead of preloading the whole file.
file_iterator = iter(open('filename.txt'))
line_parser = lambda x: map(float, x.split('\t'))
You also have the option of iterating over the file twice and passing
the count argument to fromiter, so numpy can preallocate the array.
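A minimal sketch of that approach (the file contents, the row/column
counts, and the helper name here are made up for illustration; fromiter
itself builds a 1-D array, so tabular data is reshaped afterwards):

```python
import os
import tempfile

import numpy as np

# Hypothetical small tab-separated file standing in for the 1 Gb one.
rows = ["1.0\t2.0\t3.0", "4.0\t5.0\t6.0"]
path = os.path.join(tempfile.mkdtemp(), "data.txt")
with open(path, "w") as f:
    f.write("\n".join(rows) + "\n")

def iter_values(filename):
    """Yield one float at a time so the whole file never sits in memory."""
    with open(filename) as f:
        for line in f:
            for field in line.split("\t"):
                yield float(field)

# count lets numpy preallocate the output instead of growing it as it
# reads; here the row/column counts are assumed known in advance.
n_rows, n_cols = 2, 3
data = np.fromiter(iter_values(path), dtype=np.float64,
                   count=n_rows * n_cols)
data = data.reshape(n_rows, n_cols)
```

Without count, fromiter still works but may reallocate as the array
grows, which costs extra memory during the load.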
On Wed, Aug 10, 2011 at 7:22 PM, Russell E. Owen <firstname.lastname@example.org> wrote:
> A coworker is trying to load a 1Gb text data file into a numpy array
> using numpy.loadtxt, but he says it is using up all of his machine's 6Gb
> of RAM. Is there a more efficient way to read such text data files?
> -- Russell