[Numpy-discussion] Unhelpful errors trying to create very large arrays?
Charles R Harris
charlesr.harris@gmail....
Sun Mar 22 01:03:54 CDT 2009
On Sat, Mar 21, 2009 at 11:46 PM, Matthew Brett <matthew.brett@gmail.com>wrote:
> Hello,
>
> I found this a little confusing:
>
> In [11]: n = 2500000000
>
> In [12]: np.arange(n).shape
> Out[12]: (0,)
>
> Maybe this should raise an error instead.
>
> This was a little more obvious, but perhaps again a more explicit
> error would be helpful?
>
> In [13]: np.zeros((n,))
> ---------------------------------------------------------------------------
> OverflowError Traceback (most recent call last)
>
> /home/mb312/tmp/max_speed.py in <module>()
>
> OverflowError: long int too large to convert to int
>
Open a ticket. For testing purposes, such large integers are easier to read
if they are written as products, e.g., n = 25*10**8. At 4 bytes per element,
that is about 10 GB for an integer array. How much memory does your machine
have?
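For what it's worth, the arithmetic is easy to check, and one can guard
against the silent wraparound in the arange example up front. The sketch
below assumes 4-byte integers, and the np.intp check is my own suggestion,
not something from the thread:

```python
# Sketch: estimate memory before allocating, and catch a count that the
# platform's index type cannot represent (assumes 4-byte int elements).
import numpy as np

n = 25 * 10**8  # 2.5 billion elements, written as a product for readability

# Memory estimate: element count times itemsize.
itemsize = np.dtype(np.int32).itemsize  # 4 bytes per element
est_bytes = n * itemsize                # 10**10 bytes, i.e. about 10 GB

# np.intp is the integer type NumPy uses for indexing; a count larger than
# its maximum cannot be an array dimension on this platform.
if n > np.iinfo(np.intp).max:
    raise ValueError("requested size exceeds this platform's index range")
```

On a 32-bit build np.intp is 32 bits wide, which is exactly why 2500000000
overflows there; on a 64-bit build the check passes and the allocation
itself becomes the limiting factor.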
Chuck