[SciPy-User] variable smoothing kernel
Sat Mar 26 20:54:58 CDT 2011
Well, your "time" is my wavelength: the kernel width should vary with wavelength.
I agree that implementing it in Cython would probably make it faster. I do
suspect, however, that it still won't be as fast as the fixed-kernel smoothing function.
I have learned that multiplying functions in Fourier space is equivalent
to convolving them in real space. I believe that is how the ndimage kernels work, so
I wanted to see if there's a similar shortcut for a variable kernel.
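For reference, the direct way to evaluate a variable-width Gaussian kernel is to build one kernel row per sample, which is why it is so much slower than a fixed kernel: the cost is O(n^2) instead of a single FFT or separable convolution. A minimal sketch (the function name and the per-sample `sigma` array are illustrative, not from the thread):

```python
import numpy as np

def variable_gaussian_smooth(y, sigma):
    """Smooth `y` with a Gaussian whose width `sigma[i]` varies per sample.

    Hypothetical direct implementation: each output point is a normalized
    Gaussian-weighted average over the whole array, so the cost is O(n**2)
    rather than the O(n log n) of an FFT-based fixed-kernel convolution.
    """
    y = np.asarray(y, dtype=float)
    sigma = np.broadcast_to(np.asarray(sigma, dtype=float), y.shape)
    x = np.arange(y.size)
    # Weight matrix: row i holds the Gaussian kernel centred on sample i,
    # with that sample's own width sigma[i].
    w = np.exp(-0.5 * ((x[None, :] - x[:, None]) / sigma[:, None]) ** 2)
    w /= w.sum(axis=1, keepdims=True)  # normalise each row to sum to 1
    return w @ y

# Sanity check: smoothing a constant signal must return the same constant.
out = variable_gaussian_smooth(np.ones(50), np.linspace(1.0, 5.0, 50))
```

Because every row of `w` is normalized, flat regions of the input pass through unchanged regardless of how the width varies.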
I have copied my previous attempts (which were written very simply and
take a long time) into this pastebin: http://pastebin.com/KkcEATs7
Thanks for your help
On 27/03/11 12:09 PM, Nicolau Werneck wrote:
> If I understand correctly, you want a filter that varies with "time".
> This time-variance makes it inherently more expensive to
> compute than a normal linear time-invariant filter.
> I second Christopher's suggestion: try Cython out; it's great for this
> kind of thing. Or perhaps scipy.weave.
> On Sat, Mar 26, 2011 at 9:52 AM, Wolfgang Kerzendorf
> <firstname.lastname@example.org> wrote:
>> I'm interested in a Gaussian smoothing where the kernel depends
>> (linearly, in this case) on the index it operates on. I
>> implemented it myself (probably badly) and it takes forever compared
>> to the fixed-kernel Gaussian smoothing in ndimage.
>> I could interpolate the array to be smoothed onto a log-spaced grid and
>> keep the kernel fixed, but that is complicated and I'd rather avoid it.
>> Is there a good way of doing that?
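The log-space trick mentioned above is worth sketching: if the kernel width grows linearly with wavelength (constant resolving power R, so sigma = wav / R), then on a grid uniform in log(wavelength) the kernel width becomes constant, and the fast fixed-kernel `scipy.ndimage.gaussian_filter1d` applies. This is a hypothetical sketch under those assumptions; the function name and linear interpolation are illustrative choices, not the poster's code:

```python
import numpy as np
from scipy import ndimage

def smooth_via_log_grid(wav, flux, r):
    """Smooth `flux` with a Gaussian of width wav / r (constant resolving
    power `r`) by resampling onto a uniform log-wavelength grid, where that
    kernel has a fixed width, then resampling back.  Illustrative sketch.
    """
    log_wav = np.log(wav)
    # Uniform grid in log-wavelength with the same number of samples.
    log_grid = np.linspace(log_wav[0], log_wav[-1], wav.size)
    flux_log = np.interp(log_grid, log_wav, flux)
    # In log space, sigma_lambda / lambda = 1/r is a constant width;
    # divide by the grid step to express it in pixels.
    step = log_grid[1] - log_grid[0]
    smoothed_log = ndimage.gaussian_filter1d(flux_log, (1.0 / r) / step)
    # Interpolate back onto the original wavelength grid.
    return np.interp(log_wav, log_grid, smoothed_log)

wav = np.linspace(4000.0, 7000.0, 200)
out = smooth_via_log_grid(wav, np.ones_like(wav), 1000.0)
```

The resampling introduces its own interpolation error, so this is a speed-for-accuracy trade against the exact per-pixel kernel.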