[Numpy-discussion] Assigning complex values to a real array
Wed Dec 9 03:04:54 CST 2009
On Tue, 2009-12-08 at 22:26 -0800, Dr. Phillip M. Feldman wrote:
> Darren Dale wrote:
> > On Sat, Mar 7, 2009 at 5:18 AM, Robert Kern <email@example.com> wrote:
> >> On Sat, Mar 7, 2009 at 04:10, Stéfan van der Walt <firstname.lastname@example.org>
> >> wrote:
> >> > 2009/3/7 Robert Kern <email@example.com>:
> >> >> In : z = zeros(3, int)
> >> >>
> >> >> In : z[1] = 1.5
> >> >>
> >> >> In : z
> >> >> Out: array([0, 1, 0])
> >> >
> >> > Blind moment, sorry. So, what is your take -- should this kind of
> >> > thing pass silently?
> >> Downcasting data is a necessary operation sometimes. We explicitly
> >> made a choice a long time ago to allow this.
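For reference, the truncating downcast shown in the session above can be reproduced like this (a minimal sketch of the same behaviour):

```python
import numpy as np

# Assigning a float into an integer array silently truncates
# the fractional part -- the downcast NumPy chose to allow.
z = np.zeros(3, int)
z[1] = 1.5
print(z)  # [0 1 0]
```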
I'd think that downcasting is different from dropping the imaginary
part. Also, I doubt that there is a large body of correct code relying
on the implicit behavior. This kind of assertion should of course be
checked experimentally: make the complex downcast an error, and check a
few prominent software packages.
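Such an experiment is easy to run where the cast at least emits a ComplexWarning (as `astype` does in current NumPy): promote warnings to errors and see what breaks. A sketch:

```python
import warnings
import numpy as np

a = np.array([1 + 2j, 3 + 0j])

with warnings.catch_warnings():
    # Promote the "discards the imaginary part" warning to an exception,
    # turning the silent complex->real cast into a hard error.
    warnings.simplefilter("error")
    try:
        a.astype(float)
    except Warning as w:
        print("caught:", w)
```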
An alternative to an exception would be to make complex numbers with
nonzero imaginary parts cast to *nan*. This would, however, likely lead
to errors that are difficult to track down.
Another alternative would be to raise an error only if the imaginary
part is non-zero. This requires some additional checking in places
where none is usually done.
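The extra checking could look like the following; `real_or_raise` is a hypothetical helper name, not an existing NumPy function:

```python
import numpy as np

def real_or_raise(a, tol=0.0):
    """Hypothetical helper: return the real part of `a`, but raise
    if any imaginary component exceeds `tol` in magnitude."""
    a = np.asarray(a)
    if np.iscomplexobj(a) and np.any(np.abs(a.imag) > tol):
        raise ValueError("array has non-zero imaginary parts")
    return a.real

print(real_or_raise(np.array([1 + 0j, 2 + 0j])))  # [1. 2.]
```

NumPy's own np.real_if_close is in the same spirit: it returns a real array only when all imaginary parts are negligible, and otherwise returns the input unchanged rather than raising.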
At least I tend to use .real or real() to take the real part
explicitly. In interactive use it is occasionally convenient to have
the real part taken "automatically", but sometimes this leads to
problems inside non-interactive code.
Nevertheless, I can't really regard dropping the imaginary part as a
significant issue. I've sometimes bumped into problems because of it,
though, and it would have been nice to catch them earlier. (As an
example, scipy.interpolate.interp1d some time ago silently dropped the
imaginary part -- not nice.)