[Numpy-discussion] Assigning complex value to real array
Andrew P. Mullhaupt
Thu Oct 7 18:46:14 CDT 2010
On 10/7/2010 5:14 PM, Pauli Virtanen wrote:
> to, 2010-10-07 kello 15:38 -0400, Andrew P. Mullhaupt kirjoitti:
>> On 10/7/2010 1:01 PM, Pauli Virtanen wrote:
>>> to, 2010-10-07 kello 12:08 -0400, Andrew P. Mullhaupt kirjoitti:
>>> But to implement this, you'd have to rewrite large parts of Numpy since
>>> the separated storage of re/im conflicts with its memory model.
>> You wouldn't want to rewrite any of Numpy, just add a new class.
> I believe you didn't really think about this. Because of the memory
> model mismatch, you will need to rewrite all arithmetic operations,
> special functions, and any operations that select, choose, sort, etc.
> parts of arrays.
It wouldn't be the first time I suggested rewriting the select and
choose operations. I spent months trying to get Guido to allow anything
more than slice indexing on arrays. And now, in the technologically
advanced future, we can index a numpy array with a list, not just
slices. I'm not claiming any credit for that, though; I don't know who
actually got that ball over the finish line.
But it's also not as bad as you think if you do it right. In the first
place, you don't need to rewrite any of the real operations other than
to check whether the complex part is trivial or not (which, as I pointed
out a minute ago, can be done at variable granularity, or at page
granularity for variables that take up more than a page).
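To make the idea concrete, here is a minimal sketch of such a class: the
real part is stored as an ordinary real array, and the imaginary part
stays trivial (unallocated) until a complex value is actually assigned.
The class and attribute names (WidenableArray, re, im) are hypothetical,
not anything in NumPy.

```python
import numpy as np

class WidenableArray:
    """Real-valued array that lazily allocates a separate imaginary
    part the first time a complex value is assigned (sketch only;
    names are hypothetical, not a NumPy API)."""

    def __init__(self, data):
        self.re = np.asarray(data, dtype=np.float64)
        self.im = None  # trivial imaginary part until widened

    def __setitem__(self, idx, value):
        if np.iscomplexobj(value):
            if self.im is None:
                # Widen: the real buffer stays in place, no copy needed.
                self.im = np.zeros_like(self.re)
            self.re[idx] = np.real(value)
            self.im[idx] = np.imag(value)
        else:
            self.re[idx] = value
            if self.im is not None:
                self.im[idx] = 0.0

    def __getitem__(self, idx):
        if self.im is None:
            return self.re[idx]  # purely real: no complex overhead
        return self.re[idx] + 1j * self.im[idx]
```

As long as the imaginary part is still None, every real operation can
pass straight through to the real buffer with only the trivial-part
check added.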
So it's only the case with a nontrivial complex part where you have to
do anything at all. OK, so just offset the real address by the
difference to the imaginary part. Doing this at variable-level
granularity would allow you to simply replicate all the selection and
sorting addressing.
The interesting case is noncontiguous data arrays. Here you can
actually get a benefit: if all the floating-point values are in this
class, you already know the address of the memory where you will widen
them. So if you have to widen some horrific recursive spaghetti
structure, you get a big win, because you don't have to worry about
copying the real parts.
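A rough sketch of that win, under the same separated-storage assumption:
widening a nested structure amounts to walking it and attaching a fresh
zero imaginary block for each real array it contains, leaving every
real buffer exactly where it is. The widen helper below is hypothetical,
not a NumPy function.

```python
import numpy as np

def widen(node, im_parts):
    """Walk a nested list/dict/array structure and record a zero
    imaginary block for each real array found, keyed by the array's
    identity. The real buffers themselves are never copied or moved
    (hypothetical helper, sketch only)."""
    if isinstance(node, np.ndarray):
        im_parts[id(node)] = np.zeros_like(node)
    elif isinstance(node, dict):
        for v in node.values():
            widen(v, im_parts)
    elif isinstance(node, (list, tuple)):
        for v in node:
            widen(v, im_parts)
    return im_parts

# A "spaghetti" structure of real arrays at various nesting depths.
tree = {"a": np.ones(4), "b": [np.arange(3.0), {"c": np.zeros(2)}]}
im_parts = widen(tree, {})
```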
But let's go back to what I asked: Could we have a data type that would
be able to widen itself?
It appears that your answer is YES, we COULD have that, BUT it would be
a lot of work.