[Scipy-tickets] [SciPy] #1857: Time rescaling functions for point processes
SciPy Trac
scipy-tickets@scipy....
Fri Mar 1 15:59:29 CST 2013
#1857: Time rescaling functions for point processes
-------------------------+--------------------------------------------------
Reporter: itissid | Owner: rgommers
Type: enhancement | Status: new
Priority: normal | Milestone: Unscheduled
Component: scipy.stats | Version: 0.11.0
Keywords: |
-------------------------+--------------------------------------------------
I wish to contribute to SciPy, starting with a proposal for goodness-of-fit
tests for point processes, which I could not find anywhere in the Python
stats ecosystem.
Here is the reference paper:
http://www.stat.columbia.edu/~liam/teaching/neurostat-spr12/papers/brown-et-al/time-rescaling.pdf
Proposal: the new feature will provide goodness-of-fit tests for models of
point processes via time rescaling. (Think of these as Poisson processes
whose rate parameter varies with time and is conditioned on history; an
ordinary Poisson process's rate function is memoryless.)
The time-rescaling theorem is a well-known result in probability theory,
which states that any point process with an integrable conditional
intensity function may be transformed into a Poisson process with unit
rate. The theorem leads to a natural goodness-of-fit test between a model
and point-process data.
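To make the idea concrete, here is a minimal numerical sketch of the theorem (the rate function, constants, and variable names are illustrative choices, not part of the proposal): simulate an inhomogeneous Poisson process by thinning, rescale the event times through the integrated intensity Lambda, and KS-test the transformed variates against U(0, 1).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative rate on [0, T]: lambda(t) = 5 + 4*sin(pi*t/5), bounded by lam_max.
T = 100.0
lam_max = 9.0

def lam(t):
    return 5.0 + 4.0 * np.sin(np.pi * t / 5.0)

def Lam(t):
    # Antiderivative of lam, normalized so that Lam(0) = 0.
    return 5.0 * t - (20.0 / np.pi) * (np.cos(np.pi * t / 5.0) - 1.0)

# Simulate by thinning: homogeneous candidates at rate lam_max, each kept
# with probability lam(t) / lam_max.
cand = np.sort(rng.uniform(0.0, T, rng.poisson(lam_max * T)))
events = cand[rng.uniform(0.0, lam_max, cand.size) < lam(cand)]

# Time rescaling: tau_k = Lam(u_k) - Lam(u_{k-1}) should be i.i.d. Exp(1),
# so z_k = 1 - exp(-tau_k) should be i.i.d. U(0, 1).
tau = np.diff(Lam(events), prepend=0.0)
z = 1.0 - np.exp(-tau)

# KS test of the rescaled variates against the uniform distribution.
ks_stat, p_value = stats.kstest(z, "uniform")
```

Since the model here is the true generating intensity, the KS statistic should be small; plugging in a wrong intensity would inflate it.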
Requirements:
1) Implement an interface to rescale and transform some canonical
intensity functions as described in the paper. The set of intensity
functions implemented here is deliberately limited, but nonparametric ones
(e.g. density estimates) can be supplied by the user.
2) Provide a method that performs a goodness of fit test and/or Q-Q plots
as utility functions.
3) A simple simulator for generating point-process data, for testing and
as a usage example.
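For requirement 2, the Q-Q plot utility could be as small as the sketch below (the function name `uniform_qq` and the 1.36/sqrt(n) band are my illustrative choices; the band is the usual approximate 95% KS envelope):

```python
import numpy as np

def uniform_qq(z):
    """Q-Q points for rescaled variates z against U(0, 1).

    Returns (model quantiles b_k, sorted empirical variates) together with
    the half-width of an approximate 95% confidence band. Hypothetical
    helper for illustration only.
    """
    z = np.sort(np.asarray(z, dtype=float))
    n = z.size
    # Uniform order-statistic quantiles b_k = (k - 0.5) / n.
    b = (np.arange(1, n + 1) - 0.5) / n
    # Approximate 95% band half-width for the KS statistic.
    band = 1.36 / np.sqrt(n)
    return b, z, band
```

Points straying outside b_k +/- band would indicate lack of fit, matching the graphical check described in the paper.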
A first stab at an interface specification:
{{{
"""This interface follows the specification in the paper
http://www.stat.columbia.edu/~liam/teaching/neurostat-spr12/papers/brown-et-al/time-rescaling.pdf:
it describes how spike trains are transformed and rescaled, and in the end
compute_uniform_from_exponential_tau() produces variates that can be tested
with a KS test for goodness of fit.
"""


class AbstractIntensityFunction(object):
    def __init__(self, intensity_function):
        self.intensity_function = intensity_function

    def integral_transform(self, spike_train):
        """The spike_train must be a set of spike event times
        0 <= u1, u2, ..., uN < T.
        """
        raise NotImplementedError("Should have implemented this")

    def compute_rescaled_times(self):
        """Computes the exponential R.V.s tau(k) = Lambda(u(k)) - Lambda(u(k-1)),
        where Lambda is derived from integral_transform() above.
        The R.V.s are i.i.d. exponentially distributed by Theorem 2.1.
        """
        raise NotImplementedError("Should have implemented this")

    def compute_uniform_from_exponential_tau(self):
        """Since each tau is an exponential R.V., F(tau) = 1 - exp(-tau)
        yields samples in U(0, 1) by the probability integral transform:
        http://en.wikipedia.org/wiki/Probability_integral_transform.
        Returns U(0, 1) variates from the tau(k) computed in
        compute_rescaled_times().
        """
        raise NotImplementedError("Should have implemented this")

    def fit(self):
        """Performs a Kolmogorov-Smirnov goodness-of-fit test on the
        rescaled and transformed data; could alternatively produce a Q-Q
        plot as described in the paper.
        """
        raise NotImplementedError("Should have implemented this")
}}}
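To show how a concrete subclass of this interface might look, here is a loose sketch for the simplest case, a constant-intensity (homogeneous Poisson) model. The class name and the method signatures taking `spike_train` as an argument are illustrative choices, not fixed by the proposal:

```python
import numpy as np
from scipy import stats

class ConstantIntensity:
    """Hypothetical concrete model with constant rate `rate`, so that
    Lambda(u) = rate * u. For illustration only."""

    def __init__(self, rate):
        self.rate = rate

    def integral_transform(self, spike_train):
        # Integrated intensity evaluated at the spike times.
        return self.rate * np.asarray(spike_train, dtype=float)

    def compute_rescaled_times(self, spike_train):
        # tau_k = Lambda(u_k) - Lambda(u_{k-1}); i.i.d. Exp(1) under the model.
        return np.diff(self.integral_transform(spike_train), prepend=0.0)

    def compute_uniform_from_exponential_tau(self, spike_train):
        # Probability integral transform: Exp(1) -> U(0, 1).
        return 1.0 - np.exp(-self.compute_rescaled_times(spike_train))

    def fit(self, spike_train):
        z = self.compute_uniform_from_exponential_tau(spike_train)
        return stats.kstest(z, "uniform")

# Spikes drawn from a rate-3 homogeneous Poisson process should pass the test.
rng = np.random.default_rng(1)
spikes = np.cumsum(rng.exponential(1.0 / 3.0, 200))
ks_stat, p_value = ConstantIntensity(3.0).fit(spikes)
```

The same pipeline would apply unchanged to a history-dependent model; only `integral_transform` changes.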
Hoping for recommendations and inspiration.
--
Ticket URL: <http://projects.scipy.org/scipy/ticket/1857>