[Numpy-discussion] reading big-endian uint16 into array on little-endian machine

Peter numpy-discussion@maubp.freeserve.co...
Thu Jun 17 11:11:18 CDT 2010


On Thu, Jun 17, 2010 at 3:29 PM, greg whittier <gregwh@gmail.com> wrote:
>
> I have files (from an external source) that contain ~10 GB of
> big-endian uint16's that I need to read into a series of arrays.  What
> I'm doing now is
>
> import numpy as np
> import struct
>
> fd = open('file.raw', 'rb')
>
> for n in range(10000):
>    count = 1024*1024
>    a = np.array([struct.unpack('>H', fd.read(2)) for i in range(count)])
>    # do something with a
>
> It doesn't seem very efficient to call struct.unpack one element at a
> time, but struct doesn't have an unpack_farray version like xdrlib
> does.  I also thought of using the array module and .byteswap(), but
> the help says it only works on 4 and 8 byte arrays.

I'm unclear if you want a numpy array or a standard library array,
but can you exploit the fact that struct.unpack returns a tuple? e.g.

struct.unpack(">%iH" % count, fd.read(2*count))

Peter
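A minimal sketch of the bulk-unpack idea above, using a small hand-packed
buffer in place of a chunk read from the file (the sample values are
hypothetical). It also shows an equivalent pure-NumPy route via
np.frombuffer with a big-endian uint16 dtype (">u2"), which avoids the
Python-level tuple entirely:

```python
import struct
import numpy as np

count = 4
# Stand-in for fd.read(2*count): four big-endian uint16 values packed
# into 2*count raw bytes.
buf = struct.pack(">%iH" % count, 1, 2, 515, 65535)

# One struct call per chunk instead of one call per element.
values = struct.unpack(">%iH" % count, buf)
a = np.array(values, dtype=np.uint16)

# Equivalent NumPy-only route: interpret the raw bytes directly with a
# big-endian uint16 dtype; no intermediate tuple is built.
b = np.frombuffer(buf, dtype=">u2")

print(a.tolist())            # -> [1, 2, 515, 65535]
print(np.array_equal(a, b))  # -> True
```

For whole files, np.fromfile('file.raw', dtype=">u2", count=count) reads a
chunk straight into an array the same way, skipping struct altogether.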
