[Numpy-discussion] [Python-3000] PEP 31XX: A Type Hierarchy for Numbers (and other algebraic entities)
Sun Apr 29 22:52:32 CDT 2007
On 4/29/07, Guido van Rossum <firstname.lastname@example.org> wrote:
> On 4/29/07, Jim Jewett <email@example.com> wrote:
> > On 4/29/07, Guido van Rossum <firstname.lastname@example.org> wrote:
> > > Hmm... Maybe the conclusion to draw from this is that we shouldn't
> > > make Ring a class? Maybe it ought to be a metaclass, so we could ask
> > > isinstance(Complex, Ring)?
> > Yes; all the ABCs are assertions about the class.
> I don't think so. Many are quite useful for introspection of instances
> as well, e.g. Hashable/Iterable (the whole "One Trick Ponies" section)
> as well as the distinction between Sequence and Mapping. It's the
> binary operations where the class comes into play.
I think that's one of the reasons it seemed like we should use
inheritance. Treating them as assertions about each instance makes
sense -- but it makes just as much sense to treat them as assertions
about the class. (Is it meaningful to ask about the size of this
object? Well, for objects of this class, it is.)
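To make the metaclass reading concrete -- a hypothetical sketch, not the PEP's actual code; the class names here are only illustrative -- if Ring were a metaclass rather than a base class, "is this class a ring?" becomes an ordinary isinstance() check on the class object itself:

```python
# Sketch: Ring as a metaclass, so the assertion attaches to the class,
# not to its instances.

class Ring(type):
    """Metaclass asserting its instances (classes) support ring operations."""
    pass

class Complex(metaclass=Ring):
    def __add__(self, other): ...
    def __mul__(self, other): ...

print(isinstance(Complex, Ring))    # True: the *class* is a Ring
print(isinstance(Complex(), Ring))  # False: its instances are not
```

The assertion lives one level up: instances of Complex are numbers, while Complex itself is the thing that is (or is not) a Ring.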
> > The only thing two subclasses of an *Abstract* class need to have in
> > common is that they both (independently) meet the requirements of the
> > ABC. If not for complexity of implementation, that would be better
> > described as a common metaclass.
> Again, not so fast; it depends. The way the Set section of the PEP is
> currently written, all sets are comparable (in the subset/superset
> sense) to all other sets, and for ComposableSet instances the union,
> intersection and both types of differences are also computable across
> class boundaries.
That is an additional constraint which the Set metaclass imposes --
and it does this by effectively coercing instances of both Set classes
to (builtin) frozenset instances to create the return value. (It
doesn't actually create the intermediate frozenset instances, but
MySet() | MySet() will return a frozenset rather than a MySet.)
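A minimal sketch of that coercion behavior (assumed for illustration -- this is not the PEP's implementation, and the class names are made up): a Set base whose __or__ composes across unrelated subclasses by building a plain frozenset from both operands' elements.

```python
# Sketch: cross-class set union that returns a builtin frozenset
# rather than an instance of either operand's class.

class Set:
    def __init__(self, items=()):
        self._items = frozenset(items)

    def __iter__(self):
        return iter(self._items)

    def __or__(self, other):
        # Union across class boundaries: coerce both sides to frozenset.
        return frozenset(self) | frozenset(other)

class MySet(Set): pass
class YourSet(Set): pass

u = MySet([1, 2]) | YourSet([2, 3])
print(type(u).__name__)  # frozenset
print(sorted(u))         # [1, 2, 3]
```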
> > Using a metaclass would also solve the "when to gripe" issue; the
> > metaclass would gripe if it couldn't make every method concrete. If
> > this just used the standard metaclass machinery, then it would mean a
> > much deeper metaclass hierarchy than we're used to; MutableSet would
> > have a highly derived metaclass.
> I think you're going way too fast here.
To be more precise, classes implementing MutableSet would have
MutableSet as a metaclass, which would mean (through inheritance) that
they would also have ComposableSet, Set, Sized, Iterable, and
PartiallyOrdered as metaclasses.
This is legal today, but *I* haven't seen code in the wild with a
metaclass hierarchy that deep.
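For concreteness, a hypothetical sketch of what that hierarchy would look like (the names mirror the thread; none of this is proposed code): each ABC becomes a metaclass, MutableSet inherits from the rest, and any class built with MutableSet as its metaclass is then an instance of every level.

```python
# Sketch: the ABCs as a chain of metaclasses.

class Iterable(type): pass
class Sized(Iterable): pass
class PartiallyOrdered(type): pass
class Set(Sized, PartiallyOrdered): pass
class ComposableSet(Set): pass
class MutableSet(ComposableSet): pass

class MyMutableSet(metaclass=MutableSet):
    pass

# MyMutableSet is an instance of every metaclass in the chain.
print(all(isinstance(MyMutableSet, m)
          for m in (MutableSet, ComposableSet, Set,
                    Sized, Iterable, PartiallyOrdered)))  # True
```

This is perfectly legal Python, which is the point being made: the machinery exists, but a six-deep metaclass chain is far deeper than anything commonly seen in the wild.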
> > > The more I think about it, it sounds like the right thing to do. To
> > > take PartiallyOrdered (let's say PO for brevity) as an example, the
> > > Set class should specify PO as a metaclass. The PO metaclass could
> > > require that the class implement __lt__ and __le__. If it found a
> > > class that didn't implement them, it could make the class abstract by
> > > adding the missing methods to its __abstractmethods__ attribute.
> > Or by making it a sub(meta)class, instead of a (regular instance) class.
> That makes no sense. Deciding on the fly whether something should be a
> class or a metaclass sounds like a fine recipe for end-user confusion.
Perhaps. But how is that different from deciding on the fly whether
the class will be instantiable? Either way, the user does need to
keep track of whether it is abstract; the difference is that when
inheriting, they would have to say
(metaclass=ABC) # I know it is an abstract class
(ABC) # Might be a regular class, but I can never instantiate it directly
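A sketch of the "when to gripe" idea quoted above (assumed machinery, not PEP code): a PartiallyOrdered metaclass that checks for __lt__/__le__ at class-creation time, records what is missing, and refuses instantiation while anything is.

```python
# Sketch: a metaclass that marks classes abstract when required
# methods are missing, and gripes at instantiation time.

class PartiallyOrdered(type):
    required = ('__lt__', '__le__')

    def __new__(mcls, name, bases, ns):
        cls = super().__new__(mcls, name, bases, ns)
        # object supplies default __lt__/__le__, so exclude it from the scan.
        cls._abstract_methods = frozenset(
            m for m in mcls.required
            if not any(m in c.__dict__ for c in cls.__mro__ if c is not object))
        return cls

    def __call__(cls, *args, **kwargs):
        if cls._abstract_methods:
            raise TypeError("abstract class %r is missing %s"
                            % (cls.__name__, sorted(cls._abstract_methods)))
        return super().__call__(*args, **kwargs)

class Incomplete(metaclass=PartiallyOrdered):
    pass  # no __lt__/__le__: stays abstract

class Ordered(metaclass=PartiallyOrdered):
    def __lt__(self, other): return True
    def __le__(self, other): return True

Ordered()        # fine
try:
    Incomplete() # gripes with a TypeError
except TypeError as exc:
    print("griped:", exc)
```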
> I was thinking of other representations of complex numbers as found
> e.g. in numpy. These vary mostly by using fewer (or more?) bits for
> the real and imag parts. They can't realistically subclass complex, as
> their implementation is independent; they should subclass Complex, to
> indicate that they implement the Complex API. I really think you're
> going too far with the metaclass idea.
Or they could have Complex as a metaclass, which would serve the same
purpose.
> > > I expect that the complex subclasses used in practice are
> > > all happy under mixed arithmetic using the usual definition of mixed
> > > arithmetic: convert both arguments to a common base class and compute
> > > the operation in that domain.
> > It is reasonable to insist that all Complex classes have a way to
> > transform their instances into (builtin) complex instances, if only as
> > a final fallback. There is no need for complex to be a base class.
> I agree complex shouldn't be a base class (apologies if I implied that
> by using lowercase) but I still think Complex should be a base class.
What do you get by inheriting from it, that you couldn't get by
letting it inject any missing methods?
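What "injecting missing methods" could look like, as a hypothetical sketch (the names and the injected default are invented for illustration): instead of the class inheriting a mixin implementation from Complex, the metaclass copies default implementations into any class that lacks them.

```python
# Sketch: a metaclass that injects a default __sub__ (in terms of
# + and unary -) into classes that don't define one.

def _default_sub(self, other):
    # The same default a mixin base class would provide.
    return self + (-other)

class ComplexMeta(type):
    defaults = {'__sub__': _default_sub}

    def __new__(mcls, name, bases, ns):
        for attr, impl in mcls.defaults.items():
            ns.setdefault(attr, impl)  # only inject if missing
        return super().__new__(mcls, name, bases, ns)

class MyNum(metaclass=ComplexMeta):
    def __init__(self, v): self.v = v
    def __add__(self, other): return MyNum(self.v + other.v)
    def __neg__(self): return MyNum(-self.v)

print((MyNum(5) - MyNum(2)).v)  # 3, via the injected __sub__
```

The end result for the user is the same as inheriting the mixin -- which is precisely the question: what does the base class buy you?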
> take the first opportunity). Perhaps we need to extend the built-in
> operation processing so that if both sides return NotImplemented,
> before raising TypeError, we look for some common base type
> implementing the same operation. The abstract Complex type could
> provide abstract implementations of the binary operators that would
> convert their arguments to the concrete complex type and compute the
> result that way. (Hah! Another use case for abstract methods with a
> useful implementation! :-)
That seems sensible -- but you could also just do it in the __r*__
method, since the regular method has already returned NotImplemented.
So Complex could inject a __radd__ method that tried
self.__add__(complex(other)) # since addition is commutative for Complex
If the method weren't in either class, I would expect it to be a more
general fallback, like "if we don't have __lt__, try __cmp__"
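A minimal sketch of that injected __radd__ fallback (illustrative only -- this Complex class and its conversion protocol are assumptions, not the PEP's code):

```python
# Sketch: an abstract Complex whose __radd__ retries the addition
# after converting the other operand to the builtin complex type.

class Complex:
    def __init__(self, value):
        self.value = complex(value)

    def __complex__(self):
        return self.value

    def __add__(self, other):
        if isinstance(other, Complex):
            return Complex(self.value + complex(other))
        return NotImplemented

    def __radd__(self, other):
        # other.__add__(self) already returned NotImplemented; since
        # addition is commutative for Complex, fall back to
        # self + complex(other).
        try:
            return Complex(self.value + complex(other))
        except TypeError:
            return NotImplemented

r = 1.5 + Complex(2)      # float.__add__ fails first, __radd__ catches it
print(complex(r))         # (3.5+0j)
```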