[SciPy-User] scipy.interpolate.rbf sensitive to input noise ?
Mon Feb 22 10:14:23 CST 2010
On Feb 22, 3:08 pm, josef.p...@gmail.com wrote:
> On Mon, Feb 22, 2010 at 7:35 AM, denis <denis-bz...@t-online.de> wrote:
> > On Feb 19, 5:41 pm, josef.p...@gmail.com wrote:
> >> On Fri, Feb 19, 2010 at 11:26 AM, denis <denis-bz...@t-online.de> wrote:
> >> > Use "smooth" ? rbf.py just does
> >> > self.A = self._function(r) - eye(self.N)*self.smooth
> >> > and you don't know A .
> > That's a line from scipy/interpolate/rbf.py: it solves
> > (A - smooth*I)x = b instead of
> > Ax = b
> > Looks to me like a hack for A singular, plus the caller doesn't know A
> > anyway.
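To make the quoted line concrete, here is a small self-contained numpy sketch of what that line in rbf.py amounts to for the 'gauss' basis. The function name and parameters here are illustrative, not the actual scipy internals:

```python
import numpy as np

def rbf_fit_gauss(x, y, epsilon=0.2, smooth=0.0):
    """Illustrative sketch: return RBF weights w solving (A - smooth*I) w = y,
    mirroring the line self.A = self._function(r) - eye(self.N)*self.smooth."""
    r = np.abs(x[:, None] - x[None, :])     # pairwise distances between nodes
    A = np.exp(-(r / epsilon) ** 2)         # gaussian kernel matrix (N x N)
    return np.linalg.solve(A - np.eye(len(x)) * smooth, y)

x = np.linspace(0.0, 1.0, 9)
y = np.sin(2 * np.pi * x)
w = rbf_fit_gauss(x, y, smooth=1e-6)
```

With smooth=0 this reduces to plain interpolation, A x = b; the question in this thread is what the subtraction does for smooth > 0.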
> It's not a hack it's a requirement, ill-posed inverse problems need
OK, I must be wrong; but (sorry, I'm ignorant) how can (A - smooth*I)
be a sensible penalization ?
For gauss the eigenvalues of A are >= 0, many near 0, so subtracting
smooth*I shifts them negative.
Or is it a simple sign error, i.e. should it be A + smooth*I ?
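A quick numerical check of this sign question (illustrative numpy only, not scipy's code): for the gaussian kernel A is positive semidefinite with many near-zero eigenvalues, so subtracting smooth*I produces negative eigenvalues, while the usual ridge shift A + smooth*I stays positive definite:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 20))
r = np.abs(x[:, None] - x[None, :])
A = np.exp(-r ** 2)                      # gaussian kernel matrix, PSD

smooth = 1e-6
eig_minus = np.linalg.eigvalsh(A - smooth * np.eye(20))
eig_plus = np.linalg.eigvalsh(A + smooth * np.eye(20))

# eig_minus picks up negative values (small eigenvalues pushed below 0),
# while eig_plus is bounded away from 0 on the positive side.
```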
> penalization, this is just Ridge or Tychonov with a kernel matrix. A
> is (nobs,nobs) and the number of features is always the same as the
> number of observations that are used. (I was looking at "Kernel Ridge
> Regression" and "Gaussian Process" before I realized that rbf is
> essentially the same, at least for 'gauss')
> I don't know anything about thinplate.
> I still don't understand what you mean with "the caller doesn't know
> A". A is the internally calculated kernel matrix (if I remember
Yes, that's right; but how can the caller of Rbf() give a reasonable
value of smooth, used to solve (A - smooth*I) x = b inside Rbf,
without knowing A ? And A is wildly different for gauss, linear ... too.
Or do you just shut your eyes and try 1e-6 ?
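For what it's worth, one standard answer (not specific to scipy's Rbf) is to treat smooth like any ridge parameter and pick it by held-out error instead of guessing. A hedged sketch using the conventional A + smooth*I sign; the candidate values, kernel width, and split are all made up for illustration:

```python
import numpy as np

def gauss_kernel(xa, xb, epsilon=0.2):
    # illustrative gaussian kernel between two point sets
    return np.exp(-((xa[:, None] - xb[None, :]) / epsilon) ** 2)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 40))
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(40)

# simple holdout split: even indices train, odd indices test
xtr, xte = x[::2], x[1::2]
ytr, yte = y[::2], y[1::2]

best = None
for smooth in [0.0, 1e-8, 1e-6, 1e-4, 1e-2, 1.0]:
    A = gauss_kernel(xtr, xtr)
    w = np.linalg.solve(A + smooth * np.eye(len(xtr)), ytr)
    err = np.mean((gauss_kernel(xte, xtr) @ w - yte) ** 2)
    if best is None or err < best[1]:
        best = (smooth, err)
# best[0] is the smooth value with the smallest held-out error
```

The point is only that the caller never needs to see A; they just need a cheap way to score candidate smooth values.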