[SciPy-user] Efficient file reading

Lorenzo Isella lorenzo.isella@gmail....
Tue Jan 6 16:09:35 CST 2009

Dear All,
I sometimes need to read rather large data files (~500 MB).
These are plain text files (usually tables with 500 x 2e5 entries).
It seems to me (though I have not run any serious test/benchmark) that R 
is faster than Python at reading/writing such files.
Or rather: maybe I am just being too naive when doing I/O operations in Python.
I usually simply do the following:

import pylab as p
data = p.load("my_data.dat")

which gets the job done, but is slow in this case. Probably there is a 
more efficient way of doing this, and I should also add that I know 
beforehand the dimensions of the data table I want to read into a scipy 
array.
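
To make the question concrete, here is a minimal sketch of the kind of 
approach I am wondering about (assuming a whitespace-separated numeric 
table in a hypothetical file my_data.dat, and using numpy.fromfile 
together with the known dimensions):

import numpy as np

# Hypothetical file name and shape; the table is assumed to hold
# whitespace-separated numbers.
n_rows, n_cols = 500, 200000

# np.fromfile with sep=" " parses plain-text numbers without the
# per-line overhead of pylab.load/loadtxt; the known dimensions let me
# reshape the flat result afterwards.
data = np.fromfile("my_data.dat", sep=" ").reshape(n_rows, n_cols)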
Any suggestions?
Many thanks

