[SciPy-dev] Generic polynomials class (was Re: Volunteer for Scipy Project)
Sat Oct 17 09:01:34 CDT 2009
On Sat, Oct 17, 2009 at 3:26 AM, Fernando Perez <firstname.lastname@example.org> wrote:
> On Fri, Oct 16, 2009 at 12:58 PM, Anne Archibald
> <email@example.com> wrote:
>> If we're implementing the heavy lifting in a procedural interface,
>> then the object implementations will just be plumbing (as I see your
>> analyzer objects are). So on the one hand there's not much code to be
>> shared, but on the other it's all boilerplate that would benefit from
>> code sharing.
> Yes, in our case it was a deliberate design decision: we were in a
> sense hedging our bets. Since it's so easy to go down the wrong
> design path with complex objects, we basically punted and made the
> user-visible objects trivially simple from an
> algorithmic/computational perspective. This puts all the 'smarts' in
> more cumbersome procedural interfaces, but it also means that the same
> procedural foundation can support more than one OO design. So if
> either we get our design horribly wrong, or it's simply not a good fit
> for someone, there's no major loss, as they can just build their own
> interface on top of the same machinery, and very little code is lost.
> I'm not sure this idea works really well in the long run, but we got
> there after being bitten (many times) by complex OO codes that end up
> 'trapping' a lot of procedural smarts within, and thus condemning them
> to be lost wherever the OO design doesn't work well. We thus tried to
> separate the two concerns a little bit.
Just to point out that OO design and "framework independence" are
not necessarily mutually exclusive:
In statsmodels, we still use classes and subclasses to benefit from
inheritance and OO design, but the interface is plain numpy arrays.
This way, any data framework or data structure (nitime time series,
scikits.timeseries, pandas, tabular, ...) or formula framework
can use the classes, but it has to write the conversion code from
its data frame object to the ndarray design matrix and back again.
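As an illustration of that convention, here is a hedged sketch: a model class that speaks only plain ndarrays, plus the thin conversion layer a data framework would supply. The names (``OLS``, ``ols_from_records``) are illustrative stand-ins, not the real statsmodels API, and a plain dict of columns stands in for a data frame object.

```python
import numpy as np

class OLS:
    """Estimator whose interface is plain numpy arrays only:
    a response vector and a design matrix."""
    def __init__(self, endog, exog):
        self.endog = np.asarray(endog, dtype=float)
        self.exog = np.asarray(exog, dtype=float)

    def fit(self):
        # Least-squares solution of exog @ params ~= endog
        params, *_ = np.linalg.lstsq(self.exog, self.endog, rcond=None)
        return params

def ols_from_records(records, yname, xnames):
    """Conversion layer a data framework would provide: columns
    (here a dict, standing in for a data frame) -> ndarray design
    matrix with an intercept column, fed to the array-based class."""
    y = np.asarray(records[yname], dtype=float)
    X = np.column_stack(
        [np.ones(len(y))]
        + [np.asarray(records[n], dtype=float) for n in xnames])
    return OLS(y, X)

data = {"y": [1.0, 2.0, 3.1, 4.0], "x": [0.0, 1.0, 2.0, 3.0]}
params = ols_from_records(data, "y", ["x"]).fit()
print(params)  # [intercept, slope], roughly [1.01, 1.01]
```

The estimator never imports or depends on any data framework; each framework only owns its own small conversion function.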
However, we are still not sure what or where the boundaries should
be, and we still have methods, like statistical tests, that
would be useful as standalone functions so they can be reused
without requiring that the estimation is done by statsmodels.
In the nitime case, whether ``algorithms`` uses classes wouldn't
really matter for easy use from outside of neuroimaging, as
long as it doesn't force the user of the algorithms to use the
nitime timeseries class, as the Analyzers in timeseries do.
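The standalone-function idea above can be sketched as follows: a statistical test that takes a plain array, so callers need neither a fitted statsmodels estimator nor a nitime timeseries object. ``t_test_one_sample`` is a hypothetical name for illustration, not a function from either package.

```python
import numpy as np

def t_test_one_sample(x, popmean=0.0):
    """One-sample t statistic and degrees of freedom, computed
    from a plain ndarray rather than a model or timeseries object."""
    x = np.asarray(x, dtype=float)
    n = x.size
    # t = (sample mean - hypothesized mean) / standard error
    t = (x.mean() - popmean) / (x.std(ddof=1) / np.sqrt(n))
    return t, n - 1

x = np.array([2.1, 1.9, 2.0, 2.2, 1.8])
t, df = t_test_one_sample(x, popmean=2.0)
print(t, df)  # t near 0, since the sample mean is 2.0
```

A class method could still call this function internally, so exposing it standalone costs nothing and makes it reusable from any framework.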
> More experience will tell us whether the approach does work well, and
> what its limitations in practice turn out to be.