[SciPy-user] Efficient file reading

Cohen-Tanugi Johann cohen@lpta.in2p3...
Tue Jan 6 16:22:43 CST 2009


Have you tried numpy.loadtxt?

Johann
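A minimal sketch of what that looks like, plus a variant that can be faster when the table shape is known in advance (the file name and dimensions below are made up for illustration):

```python
import numpy as np

# Hypothetical file name and table shape, for illustration only.
rows, cols = 4, 3
data = np.arange(rows * cols, dtype=float).reshape(rows, cols)
np.savetxt("my_data.txt", data)

# Straightforward: loadtxt parses the whitespace-separated text
# table directly into a 2-D NumPy array.
arr = np.loadtxt("my_data.txt")

# Often faster when you already know the shape: fromfile with a
# text separator reads a flat 1-D array, which is then reshaped.
# sep=" " matches any run of whitespace, including newlines.
flat = np.fromfile("my_data.txt", sep=" ")
arr2 = flat.reshape(rows, cols)

print(arr.shape)
print(np.allclose(arr, arr2))
```

Since fromfile skips loadtxt's per-line parsing, it tends to do better on large, purely numeric tables like the one described.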

Lorenzo Isella wrote:
> Dear All,
> I sometimes need to read rather large data files (~500 MB).
> These are plain text files (usually tables with 500 x 2e5 entries).
> It seems to me (though I have not done any serious test/benchmark) that R 
> is faster than Python at reading/writing files.
> Or rather: maybe I am too naive when doing I/O operations in Python.
> I usually simply do the following
>
> import pylab as p
>
> my_arr=p.load("my_data.txt")
>
> which gets the job done, but is slow in this case. There is probably a 
> more efficient way of doing this, and I should also add that I know 
> beforehand the dimensions of the data table I want to read into a scipy 
> array.
> Any suggestions?
> Many thanks
>
> Lorenzo
> _______________________________________________
> SciPy-user mailing list
> SciPy-user@scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-user
