[NumPy-Tickets] [NumPy] #2080: Issue with memory mapping large files

NumPy Trac numpy-tickets@scipy....
Tue Mar 13 10:32:53 CDT 2012


#2080: Issue with memory mapping large files
------------------------+---------------------------------------------------
 Reporter:  for_usenet  |       Owner:  somebody   
     Type:  defect      |      Status:  new        
 Priority:  normal      |   Milestone:  Unscheduled
Component:  numpy.core  |     Version:  1.6.1      
 Keywords:  mmap        |  
------------------------+---------------------------------------------------
 I've been using NumPy to read, organize, and process data files.

 The sizes of these files are quite variable, but they all have a fixed-
 size header, after which the data payload can range anywhere from a few
 MB to a few GB.
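
 For illustration, the pattern boils down to something like the sketch
 below; the file name, header size, and payload length here are all
 made up:

    import numpy as np

    HEADER_BYTES = 1024   # hypothetical fixed header size

    # Build a small file in the layout described above: a fixed-size
    # header followed by an int16 payload.
    with open("small.dat", "wb") as f:
        f.write(b"\x00" * HEADER_BYTES)
        np.arange(1000, dtype=np.int16).tofile(f)

    # Map only the payload by skipping the header with offset=.
    data = np.memmap("small.dat", mode='r', dtype=np.int16,
                     offset=HEADER_BYTES, shape=(1000,))
    print(data[:5])   # [0 1 2 3 4]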

 I've been able to read and process smaller files just fine, but when I
 try to read a 12 GB file, I get the following error:

   File "/Library/Python/2.6/site-
 packages/numpy-1.6.1-py2.6-macosx-10.6-universal.egg/numpy/core/memmap.py",
 line 237, in __new__
     mm = mmap.mmap(fid.fileno(), bytes, access=acc, offset=start)
 ValueError: mmap length is greater than file size

 The error occurs with NumPy 1.6.1 on Mac OS X 10.6.8 (64-bit, Python
 2.6) and on openSUSE 11.4 (64-bit, Python 2.7).
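
 The ValueError itself comes from Python's mmap module, which refuses a
 read-only mapping longer than the file. numpy asks it for roughly
 offset + prod(shape) * itemsize bytes, so either my offset/shape
 overshoot the actual file, or that length is miscomputed somewhere
 along the way. A quick comparison (the offset, shape, and file name
 below are made-up stand-ins for the real values):

    import os
    import numpy as np

    # Hypothetical stand-ins for the real offset, shape, and file name.
    offset = 8192
    shape = (3000, 2000, 1000)
    itemsize = np.dtype(np.int16).itemsize

    # Bytes the mapping will request vs. bytes actually on disk;
    # prod is taken as int64 to rule out 32-bit overflow on huge shapes.
    requested = offset + int(np.prod(shape, dtype=np.int64)) * itemsize
    on_disk = os.path.getsize("rawdata.bin")
    print(requested, on_disk, requested - on_disk)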

 The snippet of code that gives the above error is:

         # Rewind the already-open data file, then map the int16 payload
         # that starts after the fixed-size header.
         self.rawDataHeader.seek(0, 0)
         self.rawData4Recon = np.memmap(self.rawDataHeader, mode='r',
                                        dtype=np.int16,
                                        offset=self.rawDataOffset(),
                                        shape=self.rawDataShape())
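
 For debugging, the call can be guarded with a size check along these
 lines (just a sketch that restates what mmap is complaining about,
 reusing my own helper methods):

         import os

         # Sketch: verify the requested mapping fits inside the file
         # before handing it to np.memmap.
         n_items = 1
         for dim in self.rawDataShape():
             n_items *= dim
         needed = self.rawDataOffset() + n_items * np.dtype(np.int16).itemsize
         have = os.fstat(self.rawDataHeader.fileno()).st_size
         if needed > have:
             raise ValueError("need %d bytes, file has only %d" % (needed, have))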

 Please let me know if there's anything I can do to help track down or
 debug this error.

-- 
Ticket URL: <http://projects.scipy.org/numpy/ticket/2080>
NumPy <http://projects.scipy.org/numpy>

