[SciPy-User] Kurtosis/Skewness

josef.pktd@gmai...
Tue Mar 30 16:40:41 CDT 2010


On Tue, Mar 30, 2010 at 5:32 PM, nicky van foreest <vanforeest@gmail.com> wrote:
>> There are an infinite number of distributions that will have the same
>> skewness and kurtosis. However, it is reasonable to search for the
>> maximum entropy distribution satisfying those constraints. The normal
>> distribution is the maximum entropy distribution for a fixed mean and
>> variance.
>>
>> http://en.wikipedia.org/wiki/Maximum_entropy_probability_distribution
>>
>> The PDF will have the form:
>>
>>  pdf(x) = c * exp(-dot(lagrange, x ** arange(1, 5)))
>>
>> c is just the normalizing constant. You will have to find the lagrange
>> parameters that satisfy the mean, variance, skewness and kurtosis.
>> Sampling from this distribution will be tricky, though. You will have
>> to resort to general methods that are going to be pretty slow.
>
>
> This is of course a very good suggestion. However, mind that this
> claim is only true if the support of your desired distribution is the
> entire real axis. I recall that I once tried to find the maximum
> entropy distribution with given mean and variance, but such that the
> support was the positive reals (including 0), rather than the entire
> real line. This was less easy than I initially thought, and it is
> certainly not the normal distribution.
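
A minimal numerical sketch of the maximum-entropy form quoted above,
assuming the support is the whole real line: solve for the four
Lagrange multipliers so that the first four raw moments match some
illustrative target values, using scipy.integrate.quad and
scipy.optimize.fsolve.

    import numpy as np
    from scipy.integrate import quad
    from scipy.optimize import fsolve

    # target raw moments E[x], E[x**2], E[x**3], E[x**4] (illustrative values)
    target_moments = np.array([0.0, 1.0, 0.3, 3.5])

    def unnormalized_pdf(x, lagrange):
        # proportional to exp(-(l1*x + l2*x**2 + l3*x**3 + l4*x**4))
        return np.exp(-np.dot(lagrange, x ** np.arange(1, 5)))

    def moment_conditions(lagrange):
        # normalize numerically and compare the first four raw moments
        # with the targets; the x**4 coefficient has to stay positive
        # for the integrals to converge
        norm_const = quad(lambda x: unnormalized_pdf(x, lagrange),
                          -np.inf, np.inf)[0]
        moments = [quad(lambda x: x ** k * unnormalized_pdf(x, lagrange),
                        -np.inf, np.inf)[0] / norm_const
                   for k in range(1, 5)]
        return np.array(moments) - target_moments

    # start from a standard normal (x**2 coefficient 1/2) plus a small
    # positive x**4 term to keep the integrals finite
    lagrange = fsolve(moment_conditions, np.array([0.0, 0.5, 0.0, 0.01]))
    print(lagrange)

Sampling from the resulting density still needs a general method such
as rejection sampling, as noted above.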

If you want to stay close to the normal distribution, then the two
main general expansions that I have seen are the Edgeworth expansion
(I think it is based on a Taylor series expansion, but I'm not sure)
and the Gram-Charlier expansion, which is based on Hermite polynomials.
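
As a rough sketch, the leading terms of the Gram-Charlier A series
correct the standard normal density with the probabilists' Hermite
polynomials He3 and He4, weighted by the skewness and the excess
kurtosis of a standardized variable:

    import numpy as np
    from scipy.stats import norm

    def gram_charlier_pdf(x, skew, excess_kurt):
        # Gram-Charlier A series around the standard normal:
        # phi(x) * (1 + skew/6 * He3(x) + excess_kurt/24 * He4(x))
        # with He3(x) = x**3 - 3x and He4(x) = x**4 - 6x**2 + 3
        he3 = x ** 3 - 3 * x
        he4 = x ** 4 - 6 * x ** 2 + 3
        return norm.pdf(x) * (1 + skew / 6.0 * he3 + excess_kurt / 24.0 * he4)

    x = np.linspace(-4, 4, 9)
    print(gram_charlier_pdf(x, skew=0.5, excess_kurt=1.0))

The expansion is only usable for moderate skewness and excess
kurtosis; for larger values the density can turn negative in the
tails.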

For skew distributions, e.g. the skew-normal and skew-t, there are
several versions in the literature (Azzalini?).
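
For the skew-normal, Azzalini's density is 2*phi(x)*Phi(alpha*x),
which is easy to write down with scipy.stats.norm; a minimal sketch
with an illustrative shape parameter alpha:

    import numpy as np
    from scipy.stats import norm

    def skewnorm_pdf(x, alpha):
        # Azzalini's skew-normal density: 2 * phi(x) * Phi(alpha * x);
        # alpha = 0 recovers the standard normal, and the sign of alpha
        # sets the direction of the skew
        return 2 * norm.pdf(x) * norm.cdf(alpha * x)

    x = np.linspace(-3, 3, 7)
    print(skewnorm_pdf(x, alpha=4.0))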

Josef

>
> bye
>
> Nicky

