[SciPy-dev] Test Design Question for Stats.Models Code

Skipper Seabold jsseabold@gmail....
Fri Jul 17 13:40:38 CDT 2009


On Fri, Jul 17, 2009 at 1:42 PM, Bruce Southey <bsouthey@gmail.com> wrote:
> On 07/17/2009 12:26 PM, Robert Kern wrote:
>
> On Fri, Jul 17, 2009 at 12:18, Skipper Seabold <jsseabold@gmail.com> wrote:
>
>
> Hello all,
>
> I am polishing up the generalized linear models right now for the
> stats.models project and I have a question about using decorators with
> my tests.  The GLM framework has a central model with shared
> properties and several variations on that model, so as a simplified
> example my tests look like this:
>
> from numpy.testing import *
> DECIMAL = 4
> class check_glm(object):
>    '''
>    res2 results will be obtained from R or the RModelwrap
>    '''
>
>    def test_params(self):
>        assert_almost_equal(self.res1.params, self.res2.params, DECIMAL)
>
>    def test_resids(self):
>        assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL)
>
> class test_glm_gamma(check_glm):
>    def __init__(self):
>        # Preprocessing to set up the results
>        self.res1 = ResultsFromGLM
>        self.res2 = R_Results
>
> if __name__=="__main__":
>    run_module_suite()
>
> My question is whether I can skip, for argument's sake, test_resids
> depending, for example, on the class of self.res2, or because I defined
> the test condition as True in the test_<> class.  I tried putting this
> in the check_glm class:
>
> @dec.skipif(TestCondition, "Skipping this test because of ...")
> def test_resids(self):
> ...
>
> TestCondition should be None by default, but how can I get the value
> of TestCondition to evaluate to True if appropriate?  I have tried a
> few different ways, but I am a little stumped. Does this make sense/is
> it possible?  I'm sure I'm missing something obvious, but any insights
> would be appreciated.
>
>
> I don't think you can do that with the decorator. Just do the test in
> code inside the method and raise nose.SkipTest explicitly:
>
> def test_resids(self):
>     if not isinstance(self.res1, GLMResults):
>         raise nose.SkipTest("Not a GLM test")
>

Thanks for the suggestion.  This definitely works for the isinstance
check, and I will include it where that's appropriate.  Without
going into too many mundane details, it's not quite flexible enough
for my other needs.  I suppose I could have a barrage of if tests in
each test in the parent class, but it would be ugly...
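
To be concrete, the middle ground I'm picturing is a flag on the parent
class that each subclass can override, with the skip raised explicitly
the way Robert suggests.  A rough sketch (the flag name is made up, and
the subclass body just reuses the placeholders from my example above):

import nose
from numpy.testing import assert_almost_equal

DECIMAL = 4

class check_glm(object):
    # Subclasses override this flag to opt out of the residual check.
    skip_resids = False

    def test_params(self):
        assert_almost_equal(self.res1.params, self.res2.params, DECIMAL)

    def test_resids(self):
        if self.skip_resids:
            raise nose.SkipTest("No reference residuals for this family")
        assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL)

class test_glm_gamma(check_glm):
    skip_resids = True  # suppose there are no R residuals for this family
    def __init__(self):
        self.res1 = ResultsFromGLM  # our GLM results (placeholder)
        self.res2 = R_Results       # reference results from R (placeholder)

It works, but it means one flag and one if per assert, which is exactly
the barrage I'd rather avoid.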

>
> I think you should follow (or do something similar to) the approach given
> under the section 'Creating many similar tests' in the Numpy guidelines
> under Tips & Tricks:
> http://projects.scipy.org/numpy/wiki/TestingGuidelines
>
> This gives the option to group tests from related models together.
>
> Bruce
>

Thanks as well.  The tests I had were similar to the example.  The
difference is that the data is defined for each family (test subclass)
of the same model (parent test class).  I have now moved some of the
asserts to the subclasses, as in the example, so I can decorate/skip
each one as needed (this isn't quite as haphazard as it sounds), but
this somewhat defeats the purpose of having the parent class with the
reusable tests in the first place.  It's possible that my needs won't
allow me to be as lazy as I wanted to be ;).  I really wish I could
just define the TestConditions "on the fly" the way I was originally
thinking, but I'm not sure even that would have worked, given that I
couldn't do it simply with the explicit if tests.
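
For reference, the yield-based pattern from those guidelines would look
something like this in my case (glm_result_pairs is a made-up helper;
the real version would fit each family and load the corresponding R
output):

from numpy.testing import assert_almost_equal

DECIMAL = 4

def glm_result_pairs():
    # Made-up helper: return (our results, R reference results) pairs,
    # one per GLM family.
    return []

def check_params(res1, res2):
    assert_almost_equal(res1.params, res2.params, DECIMAL)

def check_resids(res1, res2):
    assert_almost_equal(res1.resids, res2.resids, DECIMAL)

def test_glm_families():
    for res1, res2 in glm_result_pairs():
        yield check_params, res1, res2
        # Only compare residuals when the reference output provides them.
        if hasattr(res2, 'resids'):
            yield check_resids, res1, res2

That groups the related checks nicely, though it pulls the per-family
data setup into one place, so I'm not sure it ends up any cleaner than
the subclass approach for me.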

Once I finish this round of refactoring, perhaps I will link to the
tests so that what's going on is clearer, and to see if anyone
can (or wants to) point out a better way.

Thanks,

Skipper

