[Numpy-discussion] numpy.any segfaults for large object arrays

Bruce Southey bsouthey@gmail....
Mon Mar 24 15:00:55 CDT 2008


Hi,
True. I noticed that on my system (with 8 GB of memory) a size of
9999 works but 10000 does not.
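
To pin down the boundary on a given machine without losing the
interactive session on every crash, something like the following
sketch (mine, not from this thread) should work: it runs each trial
in a child process and bisects on the array size, using Martin's
integer object array from the quoted session below. The helper name
any_survives is my own, and the bisection assumes the failure is
monotone in the array size.

import subprocess
import sys

def any_survives(n):
    # Run numpy.any on an n-element object array in a child process,
    # so a segfault kills only the child, not this interpreter.
    code = "import numpy; numpy.any(numpy.array([1]*%d, dtype='O'))" % n
    # A clean run exits with 0; on Unix a segfault shows up as a
    # negative return code (-11, i.e. -SIGSEGV).
    return subprocess.call([sys.executable, "-c", code]) == 0

lo, hi = 1, 10**6   # known-good and known-bad sizes from this thread
while hi - lo > 1:
    mid = (lo + hi) // 2
    if any_survives(mid):
        lo = mid
    else:
        hi = mid
print("largest working size: %d, smallest crashing: %d" % (lo, hi))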
A two-dimensional array also crashes if it is large enough:
large_m = numpy.vstack((large_none, large_none))
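
The same child-process trick isolates the 2-D case. Here I am
reconstructing large_none from the earlier messages as a large
object array filled with None; that definition is my assumption:

import subprocess
import sys

# Assumed definition of large_none: a large object array of None.
code = ("import numpy; "
        "large_none = numpy.array([None]*10**6); "
        "large_m = numpy.vstack((large_none, large_none)); "
        "numpy.any(large_m)")
rc = subprocess.call([sys.executable, "-c", code])
# On Unix, a negative return code means the child was killed by a
# signal (e.g. -11 for SIGSEGV).
print("child exited with return code %d" % rc)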

Bruce


Martin Manns wrote:
> Bruce Southey <bsouthey@gmail.com> wrote:
>> Hi,
>   
>> This also crashes with numpy 1.0.4 under Python 2.5.1. I am guessing
>> that numpy.any() may not understand the 'None'.
>>     
>
> I doubt that, because I get the segfault for all kinds of object arrays that I try out:
>
> ~$ python
> Python 2.4.5 (#2, Mar 12 2008, 00:15:51) 
> [GCC 4.2.3 (Debian 4.2.3-2)] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
>   
>>>> import numpy
>>>> small_obj = numpy.array([1]*10**3, dtype="O")
>>>> numpy.any(small_obj)
> True
>   
>>>> large_obj = numpy.array([1]*10**6, dtype="O")
>>>> numpy.any(large_obj)
> Segmentation fault
> ~$ python
>   
>>>> import numpy
>>>> large_strobj = numpy.array(["Yet another string."]*10**6, dtype="O")
>>>> numpy.any(large_strobj)
> Segmentation fault
>
> Martin
>


