[Numpy-discussion] Nasty bug using pre-initialized arrays

Stuart Brorson sdb@cloud9....
Fri Jan 4 18:45:19 CST 2008

>>>> I realize NumPy != Matlab, but I'd wager that most users would
>>>> think that this is the natural behavior......
>>> Well, that behavior won't happen. We won't mutate the dtype of the
>>> array because of assignment. Matlab has copy(-on-write) semantics
>>> for things like slices while we have view semantics. We can't
>>> safely do the reallocation of memory [1].
>> That's fair enough.  But then I think NumPy should consistently
>> typecheck all assignments and throw an exception if the user attempts
>> an assignment which loses information.
> There is a long history in numeric/numarray/numpy about this "feature".
> And for many of us, it really is a feature -- it prevents the automatic
> upcasting of arrays, which is very important if your arrays are huge
> (i.e. comparable in size to your system memory).

That's well and good.  But NumPy should *never* automatically -- and
silently -- chop the imaginary part off your complex array elements,
particularly if you are just doing an innocent assignment!
Doing something drastic like silently throwing half your data away can
lead to all kinds of bugs in code written by somebody who is unaware
of this behavior (i.e. most people)!

It sounds to me like the right thing is to throw an exception instead
of "downcasting" a data object.
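One way to get that exception today, without changing NumPy itself, is to check the cast with `np.can_cast` before assigning. `safe_setitem` below is a hypothetical helper, not part of NumPy's API:

```python
import numpy as np

def safe_setitem(arr, index, value):
    """Assign only if the cast is value-preserving ('safe'); else raise.

    Hypothetical helper illustrating the proposed exception-raising
    behavior; not part of NumPy's API.
    """
    src = np.asarray(value).dtype
    if not np.can_cast(src, arr.dtype, casting="safe"):
        raise TypeError(
            f"assigning {src} into {arr.dtype} array would lose information"
        )
    arr[index] = value

f = np.zeros(3)
safe_setitem(f, 0, 2.5)          # fine: float64 -> float64
try:
    safe_setitem(f, 1, 1 + 2j)   # complex128 -> float64 is not safe
except TypeError as e:
    print(e)
```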

