[SciPy-User] scipy.interpolate.rbf: how is "smooth" defined?
josef.pktd@gmai...
Mon Aug 30 07:25:09 CDT 2010
On Mon, Aug 30, 2010 at 7:10 AM, Mischa Schirmer
<mischa@astro.uni-bonn.de> wrote:
> Hello,
>
> I'm doing a 2D fit using scipy.interpolate.rbf.
> I have no problem with the fit itself, it works fine.
>
> Data points are randomly scattered in a x-y plane,
> and have a z-value each. The data is fairly well
> behaved, in the sense that variations across the x-y
> plane are slow. The data is somewhat noisy, and
> thus I want to smooth it.
>
> rbf has a 'smooth' parameter which is working well,
> but it is extremely poorly documented. Basically,
> for smooth=0 no smoothing takes place, and for
> smooth>0 some smoothing takes place.
smooth<0 also works, and, I think, is the correct sign for the gaussian kernel.
>
> Does anybody know how large the smoothing length
> is? Is it internally modified depending on the
> number of nodes entering the fit?
No, the smoothing factor is used directly.
My intuition, mainly for the gaussian case: the estimate depends on the
data and on a prior that biases towards zero. The matrix in the normal
equations is a weighted combination (except for a negative sign) of the
data (the kernel matrix) with weight 1 and an identity matrix with weight
smooth.
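A minimal sketch of this behaviour, assuming scipy.interpolate.Rbf with its default multiquadric kernel (as far as I can tell from the scipy source, smooth times the identity is subtracted from the kernel matrix before solving):

```python
import numpy as np
from scipy.interpolate import Rbf

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = rng.uniform(0, 10, 50)
z = np.sin(x) * np.cos(y) + rng.normal(0, 0.1, 50)

# smooth=0: the interpolant reproduces the data exactly at the nodes
exact = Rbf(x, y, z, smooth=0.0)
resid_exact = np.max(np.abs(exact(x, y) - z))

# smooth>0: the diagonal of the kernel matrix is perturbed by -smooth,
# so the fit no longer passes through the data points
smoothed = Rbf(x, y, z, smooth=1.0)
resid_smooth = np.max(np.abs(smoothed(x, y) - z))

print(resid_exact, resid_smooth)
```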
It's a bit similar to Ridge Regression, where the relative weights can be
selected depending on the data (but are often just given as a parameter
that depends on the "prior beliefs").
I think it's the usual bandwidth selection problem, and scipy offers
no support for it (until someone adds it).
Also, I don't know the rbf literature, so I have no idea whether there
are any easy or standard solutions for choosing the smoothing parameter,
but there should be bandwidth selection criteria analogous to those for
other smoothers.
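One generic way to pick the parameter, in the absence of built-in support, is brute-force leave-one-out cross-validation. loo_cv_smooth below is a hypothetical helper, not part of scipy, just a sketch of the standard approach:

```python
import numpy as np
from scipy.interpolate import Rbf

def loo_cv_smooth(x, y, z, candidates):
    """Pick the smooth value with the lowest leave-one-out squared error."""
    best, best_err = None, np.inf
    n = len(z)
    for s in candidates:
        err = 0.0
        for i in range(n):
            mask = np.arange(n) != i       # drop point i, fit on the rest
            fit = Rbf(x[mask], y[mask], z[mask], smooth=s)
            err += (fit(x[i], y[i]) - z[i]) ** 2
        if err < best_err:
            best, best_err = s, err
    return best

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 40)
y = rng.uniform(0, 10, 40)
z = np.sin(x) + rng.normal(0, 0.2, 40)

best = loo_cv_smooth(x, y, z, [0.0, 0.01, 0.1, 1.0])
print(best)
```

This refits the Rbf n times per candidate, so it is only practical for the smaller problems (a few hundred nodes, not 10000).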
>
> I'm asking because the number of nodes in my data
> can vary from a few 10s to more than 10000.
> Clearly, in the latter case I can do with a much
> smaller smoothing scale, but I'd like to determine
> it in some automatic fashion.
Is this true for all radial basis functions?
If I remember correctly, I had to smooth more with many points; maybe
it depends on the density of the points and the amount of noise.
Josef
>
> It appears that the smoothing length is
> somehow normalised. At the moment I set 'smooth'
> to 10% of the average distance between nodes
> (in data units), resulting in a (numerically)
> much larger effective smoothing scale.
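That 10%-of-average-spacing heuristic can at least be computed automatically, e.g. with a KD-tree nearest-neighbour query (a sketch, not anything scipy provides for Rbf):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
pts = rng.uniform(0, 100, size=(200, 2))   # scattered x-y nodes

# k=2 because the nearest neighbour of each point is the point itself
tree = cKDTree(pts)
dists, _ = tree.query(pts, k=2)
avg_nn = dists[:, 1].mean()                # mean nearest-neighbour spacing

smooth = 0.1 * avg_nn                      # the 10% heuristic from the post
print(avg_nn, smooth)
```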
>
> Any insight is much appreciated!
>
> Mischa Schirmer
> _______________________________________________
> SciPy-User mailing list
> SciPy-User@scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user
>