[Numpy-discussion] numpy.any segfaults for large object arrays

Bruce Southey bsouthey@gmail....
Mon Mar 24 12:59:32 CDT 2008


Hi,
This also crashes with numpy 1.0.4 under Python 2.5.1. I am guessing it
may be because numpy.any() does not handle the 'None' objects correctly.

Bruce

Martin Manns wrote:
>> On 24 Mar 2008, at 14:05, Martin Manns wrote:
>>
>>     
>>> Hello,
>>>
>>> I am encountering a problem (a bug?) with the numpy any function.
>>> Since the python any function behaves in a slightly different way,
>>> I would like to keep using numpy's.
>>>
>>>       
>> I cannot confirm the problem on my intel macbook pro using the same  
>> Python and Numpy versions. Although any(numpy.array(large_none)) takes  
>> a significantly longer time than any(numpy.array(large_zero)), the  
>> former does not segfault on my machine.
>>     
>
> I tested it on a Debian box (again Numpy 1.0.4) and was able to reproduce the problem:
>
> ~$ python
> Python 2.4.5 (#2, Mar 12 2008, 00:15:51) 
> [GCC 4.2.3 (Debian 4.2.3-2)] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
>   
>>>> import numpy
>>>> numpy.version.version
>>>>         
> '1.0.4'
>   
>>>> large_none = [None] * 1000000
>>>> numpy.any(numpy.array(large_none))
>>>>         
> Segmentation fault
> ~$ python2.5
> Python 2.5.2a0 (r251:54863, Feb 10 2008, 01:31:28) 
> [GCC 4.2.3 (Debian 4.2.3-1)] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
>   
>>>> import numpy
>>>> large_none = [None] * 1000000
>>>> numpy.any(numpy.array(large_none))
>>>>         
> Segmentation fault
>
>   
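For anyone running into this today, one hedged workaround (an assumption about sidestepping the crash, not a fix for the underlying numpy bug) is to skip the conversion to a NumPy object array entirely and test the plain Python list with the built-in any(). For a flat list this gives the expected truth-testing behavior, since the built-in treats None as falsy:

```python
# Workaround sketch (assumption): avoid calling numpy.any() on a large
# object array by applying Python's built-in any() to the list directly.
large_none = [None] * 1000000

# The built-in any() treats None as falsy, so an all-None list is False.
print(any(large_none))  # False

# A list containing at least one truthy element yields True.
mostly_none = [None] * 1000000 + [1]
print(any(mostly_none))  # True
```

Note the caveat Martin raised: the built-in any() is not a drop-in replacement for numpy.any() in general (for instance, on multidimensional arrays numpy.any() reduces over all elements, while the built-in iterates only over the first axis), so this only applies to flat sequences like the one above.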
