[Numpy-discussion] reading big-endian uint16 into array on little-endian machine

Robert Kern robert.kern@gmail....
Thu Jun 17 09:58:03 CDT 2010


On Thu, Jun 17, 2010 at 09:46, Francesc Alted <faltet@pytables.org> wrote:
> On Thursday 17 June 2010 16:29:54, greg whittier wrote:
>> I have files (from an external source) that contain ~10 GB of
>> big-endian uint16's that I need to read into a series of arrays.  What
>> I'm doing now is
>>
>> import numpy as np
>> import struct
>>
>> fd = open('file.raw', 'rb')
>>
>> for n in range(10000):
>>     count = 1024*1024
>>     a = np.array([struct.unpack('>H', fd.read(2)) for i in range(count)])
>>     # do something with a
>>
>> It doesn't seem very efficient to call struct.unpack one element at a
>> time, but struct doesn't have an unpack_farray version like xdrlib
>> does.  I also thought of using the array module and .byteswap() but
>> the help says it only works on 4- and 8-byte arrays.
>
> Maybe it is a problem with the docs.

I think he was talking about the standard library array module.
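For reference, the per-element struct.unpack loop quoted above can be avoided entirely by giving NumPy a big-endian dtype ('>u2') and reading whole chunks at once with np.fromfile. This is only a sketch of that approach, not code from the thread; the filename and element counts are placeholders for illustration:

```python
import numpy as np

# Create a small sample file of big-endian uint16 values, standing in
# for the real 10 GB input described in the thread.
np.arange(8, dtype=np.uint16).astype('>u2').tofile('file.raw')

with open('file.raw', 'rb') as fd:
    # NumPy interprets the bytes as big-endian uint16 directly;
    # no per-element unpacking is needed.
    a = np.fromfile(fd, dtype='>u2', count=8)

# Convert to the machine's native byte order if downstream code
# expects ordinary uint16 arrays.
a_native = a.astype(np.uint16)
```

For the 10 GB case, the same call can be placed inside the loop with count=1024*1024, reading one chunk per iteration without loading the whole file.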

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
  -- Umberto Eco
