[SciPy-User] [ANN] scikit.statsmodels 0.2.0 release
Thu Feb 18 20:28:24 CST 2010
On Thu, Feb 18, 2010 at 6:28 PM, Robert Kern <firstname.lastname@example.org> wrote:
> On Thu, Feb 18, 2010 at 17:23, <email@example.com> wrote:
>> hit the wrong button
>> On Thu, Feb 18, 2010 at 5:34 PM, <firstname.lastname@example.org> wrote:
>>> On Thu, Feb 18, 2010 at 5:30 PM, Gael Varoquaux
>>> <email@example.com> wrote:
>>>> On Thu, Feb 18, 2010 at 05:24:58PM -0500, David Warde-Farley wrote:
>>>>> On 16-Feb-10, at 2:14 PM, Skipper Seabold wrote:
>>>>> > * Added four discrete choice models: Poisson, Probit, Logit, and
>>>>> > Multinomial Logit.
>> They are still new, so some problems might still be lurking around.
>>>>> Awesome. I look forward to trying these out (by the way, do you
>>>>> support any regularization methods? L2/L1?)
>> L2 Tikhonov-style penalization is planned for specific models, e.g.
>> generalized/Bayesian ridge regression, vector autoregression with
>> dummy-variable priors, and other shrinkage estimators for cases where
>> there are too many parameters to estimate.
>> There are no plans for the lasso, although I would like LARS as a
>> complement to Principal Component Regression; it's not high on the
>> priority list.
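For anyone following along, the L2 penalization mentioned above has a simple closed form. This is just an illustrative sketch of the idea, not statsmodels code or its API:

```python
# Minimal sketch of L2 (Tikhonov/ridge) penalization via the normal
# equations: beta = (X'X + alpha*I)^{-1} X'y.  Illustrative only.
import numpy as np

def ridge(X, y, alpha):
    """Ridge estimate: solve (X'X + alpha*I) beta = X'y."""
    n_features = X.shape[1]
    A = X.T @ X + alpha * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.standard_normal(50)

beta_ols = ridge(X, y, alpha=0.0)    # alpha=0 reduces to OLS
beta_pen = ridge(X, y, alpha=10.0)   # larger alpha shrinks toward zero
```

With alpha=0 this is ordinary least squares; increasing alpha shrinks the coefficient vector, which is the point when there are too many parameters to estimate reliably.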
Funny this should come up. I wasn't really aware of this area, on either
the implementation or the application side, until I saw the paper Robert
mentions below on Andy Gelman's blog and started reading up. Just this
afternoon I was reading about the method of regularization as part of
the generalized entropy framework, which the scipy.maxentropy module
partially supports right now (smoothing with a Gaussian prior). For my
part, given my work this semester, I am most likely to first have
examples of discrete choice models (a class that generalizes the logit,
at least) with regularization in the maxent framework.
As always code donations to statsmodels are welcomed and encouraged...
> There's been a recent paper that might be of interest. The associated
> code is GPLed, alas.
> We develop fast algorithms for estimation of generalized linear models
> with convex penalties. The models include linear regression, two-class
> logistic regression, and multinomial regression problems, while the
> penalties include L1 (the lasso), L2 (ridge regression) and mixtures
> of the two (the elastic net). The algorithms use cyclical coordinate
> descent, computed along a regularization path. The methods can handle
> large problems and can also deal efficiently with sparse features. In
> comparative timings we find that the new algorithms are considerably
> faster than competing methods.
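Since the paper's code is GPLed, here is a toy version of the cyclical coordinate descent it describes, for the plain lasso case. This sketch is mine, not the paper's implementation, and it skips the regularization-path and sparse-feature machinery:

```python
# Toy cyclical coordinate descent for the lasso:
# minimize (1/2n)*||y - X beta||^2 + alpha*||beta||_1.
# Illustration only -- not the (GPLed) code from the paper.
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: the exact 1-D lasso solution."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, alpha, n_iter=100):
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with feature j's contribution removed
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            # univariate lasso update for coordinate j
            beta[j] = soft_threshold(rho, alpha) / col_sq[j]
    return beta

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 4))
beta_true = np.array([3.0, 0.0, 0.0, 1.5])
y = X @ beta_true + 0.1 * rng.standard_normal(100)

beta_small = lasso_cd(X, y, alpha=0.01)  # near the true coefficients
beta_large = lasso_cd(X, y, alpha=10.0)  # penalty zeroes everything out
```

Each coordinate update is a cheap closed-form soft-threshold, which is why the cyclical scheme scales to large problems; the elastic net adds an L2 term to the denominator of the same update.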
> Robert Kern
> "I have come to believe that the whole world is an enigma, a harmless
> enigma that is made terrible by our own mad attempt to interpret it as
> though it had an underlying truth."
> -- Umberto Eco