[NumPy-Tickets] [NumPy] #1522: segfault in any() on large object array

NumPy Trac numpy-tickets@scipy....
Mon Jan 3 18:20:27 CST 2011


#1522: segfault in any() on large object array
------------------------+---------------------------------------------------
 Reporter:  glub        |       Owner:  somebody
     Type:  defect      |      Status:  new     
 Priority:  high        |   Milestone:  2.0.0   
Component:  numpy.core  |     Version:  devel   
 Keywords:              |  
------------------------+---------------------------------------------------

Comment(by jpeel):

 I've found the problem. When the buffer is used (# of objects >= 10000),
 the process begins as follows: the first object of the array is copied
 and then cast to a Bool. If either the input array contains objects or
 the output will be of type object, then loop->obj is set to 1 in
 construct_reduce(), which signals that the cast value (here, the Bool)
 should be INCREFed. But because that value is now a raw Bool rather than
 a PyObject pointer, the INCREF dereferences a garbage address and
 segfaults. The same problem doesn't occur with smaller arrays, because
 there the first object is copied without being cast and then INCREFed;
 since it is still a PyObject, there is no problem.
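
 For reference, a minimal sketch of the crash mechanism. The function and
 variable names below are illustrative only, not the actual numpy.core
 source:

   /* Illustrative sketch (hypothetical names) of the start of the
    * buffered reduce path described above. */
   #include <Python.h>

   typedef unsigned char npy_bool;

   static void
   buffered_reduce_first_step(PyObject **input, char *castbuf, int loop_obj)
   {
       /* Step 1: copy/cast the first element.  For any() the cast
        * target is Bool, so castbuf now holds a raw 0/1 byte. */
       *(npy_bool *)castbuf = (npy_bool)PyObject_IsTrue(input[0]);

       /* Step 2 (the bug): loop->obj is 1 because the input dtype is
        * object, so the refcount bookkeeping still runs -- but castbuf
        * no longer holds a PyObject pointer.  Reinterpreting the Bool
        * byte as a pointer and INCREFing it is what segfaults. */
       if (loop_obj) {
           Py_INCREF(*(PyObject **)castbuf);   /* crash */
       }
   }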

 The patch that I submitted simply removes the lines that INCREF when
 loop->obj is set and the array is large enough to trigger buffering. The
 only potential problem with doing this is if a ufunc generates an object
 as output, but I don't really see that as a possibility. Does anyone have
 any problem with this fix? The alternative is to split loop->obj into two
 flags, one for when the input is an object and one for when the output is
 an object, as sketched below.
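
 For concreteness, a sketch of that two-flag alternative. The struct and
 field names are hypothetical, not what construct_reduce() actually uses:

   #include <Python.h>

   typedef struct {
       int obj_in;   /* input array has dtype=object          */
       int obj_out;  /* output (cast target) holds PyObject*  */
   } reduce_obj_flags;

   static void
   incref_cast_result(char *castbuf, reduce_obj_flags flags)
   {
       /* Only treat the cast buffer as a PyObject* when the output
        * type really is object.  For any() on an object array,
        * obj_in is 1 but obj_out is 0 (the cast target is Bool),
        * so no INCREF runs and the segfault is avoided. */
       if (flags.obj_out) {
           Py_INCREF(*(PyObject **)castbuf);
       }
       /* obj_in would still gate the DECREF/cleanup of consumed
        * input elements elsewhere in the loop. */
   }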

-- 
Ticket URL: <http://projects.scipy.org/numpy/ticket/1522#comment:6>
NumPy <http://projects.scipy.org/numpy>