[SciPy-user] read/write compressed files
Wed Jun 20 15:18:26 CDT 2007
I got it (partially) working, but am not sure it is optimal. In
particular, will fromstring copy the decompressed data into the array,
or decompress in place? I think the former (how else would it know the
size, and tell() will be slow), but please correct me if I am wrong.
import gzip
from numpy import fromstring, zeros
from scipy import io

# Read the gzipped binary data and unpack it as doubles.
fh = gzip.GzipFile("test.dat.gz", 'rb')
#ps = zeros(256*256)  # preallocating first - will it help?
ps = fromstring(fh.read(), 'd')  # copies the decompressed bytes
ps.shape = (256, 256)

# Write the array back out uncompressed.
fp = open('test.dat', 'wb')
io.numpyio.fwrite(fp, ps.size, ps)
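For what it is worth, a sketch of an alternative (assuming a reasonably
recent NumPy): frombuffer wraps the bytes object returned by fh.read()
directly instead of copying it the way fromstring does, so only the
decompression itself touches the data. The filename and 256x256 shape
below are just the ones from my example.

```python
import gzip
import numpy as np

# Create a small test array and write it out gzip-compressed.
a = np.arange(256 * 256, dtype='d').reshape(256, 256)
with gzip.open('test.dat.gz', 'wb') as fh:
    fh.write(a.tobytes())

# Read it back.  fh.read() decompresses into one bytes object;
# np.frombuffer then wraps that buffer without a further copy
# (the resulting array is read-only, since bytes is immutable).
with gzip.open('test.dat.gz', 'rb') as fh:
    ps = np.frombuffer(fh.read(), dtype='d').reshape(256, 256)

print(np.array_equal(a, ps))  # True
```

The trade-off is that the frombuffer result cannot be modified in
place; copy it explicitly if you need a writable array.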
Dominik Szczerba wrote:
> That works very well for ASCII files, but I failed to figure out how
> to do it for binary data...
> Thanks for any hints,
> - Dominik
> Anne Archibald wrote:
>> On 20/06/07, Dominik Szczerba <firstname.lastname@example.org> wrote:
>>> Yes, I know it, but it does not return a scipy array, does it?
>>> Can I achieve it without copying memory? (I have huge arrays to process)
>> If the bz2 module will provide a file-like object, scipy.read_array
>> can read from that.
Dominik Szczerba, Ph.D.
Computer Vision Lab CH-8092 Zurich