[Numpy-discussion] saving incrementally numpy arrays

Juan Fiol fiolj@yahoo....
Wed Aug 12 18:11:26 CDT 2009


Hi, I finally decided on the PyTables approach because it will be easier to work with the data later. I know this is not the right place, but maybe I can get some quick pointers. At each time step I calculate a numpy array of about 20 columns and a few thousand rows, and I'd like to append all of its rows to a table without iterating over the numpy array. Does anyone know what the "right" approach would be? I am looking for something simple; I do not need to keep the piece of the table after I put it into the h5file. Thanks in advance and regards, Juan
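[Editor's note: a minimal sketch of what Juan asks for, using the modern PyTables API. An EArray has one extendable dimension (marked with 0 in `shape`), and `append()` takes a whole NumPy array at once, so no row-by-row loop is needed. The file and node names (`myfile.h5`, `/data`) are made up for illustration.]

```python
import numpy as np
import tables

# One batch of results: a few thousand rows, 20 columns
block = np.random.rand(3000, 20)

with tables.open_file('myfile.h5', mode='w') as h5:
    # 0 in the first position of `shape` marks the extendable dimension
    earr = h5.create_earray(h5.root, 'data',
                            atom=tables.Float64Atom(),
                            shape=(0, 20))
    earr.append(block)        # appends all 3000 rows in one call
    earr.append(block[:100])  # later chunks are appended the same way
```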

--- On Tue, 8/11/09, Citi, Luca <lciti@essex.ac.uk> wrote:

> From: Citi, Luca <lciti@essex.ac.uk>
> Subject: Re: [Numpy-discussion] saving incrementally numpy arrays
> To: "Discussion of Numerical Python" <numpy-discussion@scipy.org>
> Date: Tuesday, August 11, 2009, 9:26 PM
> You can do something a bit tricky, but it should work.
> I assume a C-ordered 1-d vector.
> 
> 
> 
> import numpy as np
> import numpy.lib.format as fmt
> 
> # example of chunks
> chunks = [np.arange(l) for l in range(5, 10)]
> 
> # at the beginning
> fp = open('myfile.npy', 'wb')
> d = dict(
>     descr=fmt.dtype_to_descr(chunks[0].dtype),
>     fortran_order=False,
>     shape=(2**30,),  # some big shape you think you'll never reach
> )
> fp.write(fmt.magic(1, 0))
> fmt.write_array_header_1_0(fp, d)
> h_len = fp.tell()
> l = 0
> # ... for each chunk ...
> for chunk in chunks:
>     l += len(chunk)
>     fp.write(chunk.tostring('C'))
> # finally, rewrite the header with the real shape
> fp.seek(0, 0)
> fp.write(fmt.magic(1, 0))
> d['shape'] = (l,)
> fmt.write_array_header_1_0(fp, d)
> fp.write(' ' * (h_len - fp.tell() - 1))
> fp.close()
> 
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> http://mail.scipy.org/mailman/listinfo/numpy-discussion
> 
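[Editor's note: the quoted recipe targets the NumPy of 2009. On recent NumPy, `write_array_header_1_0` writes the magic string itself (so writing `fmt.magic(1, 0)` separately would duplicate it) and `tostring` has been replaced by `tobytes`. A sketch of the same trick for modern NumPy/Python 3; the filename `myfile.npy` is arbitrary:]

```python
import numpy as np
import numpy.lib.format as fmt

chunks = [np.arange(n, dtype=np.int64) for n in range(5, 10)]

fp = open('myfile.npy', 'wb')
header = dict(
    descr=fmt.dtype_to_descr(chunks[0].dtype),
    fortran_order=False,
    shape=(2**30,),  # placeholder length we will never actually reach
)
# On recent NumPy, write_array_header_1_0 emits the magic string, the
# header-length field and the space-padded header dict in one call.
fmt.write_array_header_1_0(fp, header)
n = 0
for chunk in chunks:
    n += len(chunk)
    fp.write(chunk.tobytes('C'))
# Rewrite the header with the true length.  Both shape strings pad to
# the same 64-byte-aligned header size, so the data offset is unchanged.
fp.seek(0, 0)
header['shape'] = (n,)
fmt.write_array_header_1_0(fp, header)
fp.close()

arr = np.load('myfile.npy')  # shape (35,)
```

The trick relies on the placeholder shape string and the final shape string padding out to the same aligned header length; a placeholder with more digits than the final length will ever have keeps the data offset stable.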

More information about the NumPy-Discussion mailing list