[SciPy-User] variable smoothing kernel
Wolfgang Kerzendorf
wkerzendorf@googlemail....
Sun Mar 27 02:16:21 CDT 2011
So I have now tried interpolating onto a logarithmic spacing. This is
very fast (a spectrum with 8000 points takes about 1 ms). I think for
now this is a good method. I have also checked the errors associated
with interpolating to log space and back, and they seem to be
relatively small.
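The round-trip error mentioned above can be estimated with a short sketch like the following (a hypothetical synthetic spectrum, not the actual data; grid limits and the sinusoidal flux are illustrative assumptions):

```python
# Sketch: estimate the round-trip error of resampling a spectrum onto a
# log-spaced wavelength grid and back (synthetic data, not the real spectrum).
import numpy as np

wave = np.linspace(4000.0, 9000.0, 8000)          # linear wavelength grid
flux = 1.0 + 0.5 * np.sin(wave / 50.0)            # smooth synthetic spectrum

# Resample onto a log-spaced grid with the same number of points ...
log_wave = np.logspace(np.log10(wave[0]), np.log10(wave[-1]), wave.size)
flux_log = np.interp(log_wave, wave, flux)        # linear -> log grid

# ... and back onto the original linear grid.
flux_back = np.interp(wave, log_wave, flux_log)   # log -> linear grid

# Maximum relative round-trip error; small for a well-sampled spectrum.
rel_err = np.max(np.abs(flux_back - flux) / np.abs(flux))
print(rel_err)
```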
Thanks for all your suggestions and help.
Cheers
Wolfgang
On 27/03/11 2:01 PM, Nicolau Werneck wrote:
> On Sun, Mar 27, 2011 at 12:54:58PM +1100, Wolfgang Kerzendorf wrote:
>> Well, your time is my wavelength: the kernel width should vary with
>> wavelength.
> OK, but what kind of data do you have? Is it a 1-dimensional signal
> whose Fourier transform you have taken, so that you are now filtering
> in the frequency domain?
>
>> I agree that implementing it in Cython will probably make it faster.
>> I do suspect, however, that it won't be as fast as the normal
>> smoothing function.
>>
>> I have learned that multiplying functions in Fourier space is the
>> same as convolving them. I believe that is how the ndimage kernels
>> work so incredibly fast.
>> I wanted to see if there's a similar shortcut for a variable kernel.
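The convolution theorem referred to above can be verified numerically; a minimal illustration (not from the thread's code, signal and kernel chosen arbitrarily):

```python
# Check that pointwise multiplication in Fourier space equals circular
# convolution in the signal domain (the convolution theorem).
import numpy as np

x = np.random.default_rng(0).normal(size=64)         # random test signal
k = np.exp(-0.5 * (np.arange(64) - 32) ** 2 / 4.0)   # fixed Gaussian kernel

# Circular convolution via FFT: multiply the transforms, invert.
via_fft = np.fft.ifft(np.fft.fft(x) * np.fft.fft(k)).real

# Direct circular convolution for comparison: c[i] = sum_m x[m] k[(i-m) mod N]
direct = np.array([np.sum(x * np.roll(k[::-1], i + 1)) for i in range(64)])

print(np.allclose(via_fft, direct))   # True
```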
> Implementing it in Cython will definitely make it faster! Whenever
> you have large loops running over vectors or arrays, Cython will give
> you great speedups.
>
> And as I was saying, it will never be as fast, because applying a
> linear time-invariant filter is inherently easier: you can use the
> FFT and multiply in the transform domain, as you said.
>
> In your case you could consider filtering the signal with a filter
> bank and then picking values from the results according to the
> formula you use for calculating your kernel. It may or may not be
> quicker, and it won't work if you need arbitrary precision in the
> parameters of your filter.
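The filter-bank idea could be sketched roughly like this (all names, the choice of sigmas, and the linear sigma-versus-index law are illustrative assumptions, not the thread's actual formula):

```python
# Filter-bank sketch: smooth with a few fixed sigmas, then interpolate
# between the bank outputs according to the desired per-pixel width.
import numpy as np
from scipy.ndimage import gaussian_filter1d

signal = np.random.default_rng(1).normal(size=1000)

sigmas = np.array([1.0, 2.0, 4.0, 8.0])                     # the filter bank
bank = np.array([gaussian_filter1d(signal, s) for s in sigmas])

# Desired kernel width grows linearly with the index (as in the question).
target_sigma = np.linspace(1.0, 8.0, signal.size)

# For each pixel, linearly interpolate between the two nearest bank outputs.
idx = np.searchsorted(sigmas, target_sigma).clip(1, len(sigmas) - 1)
w = (target_sigma - sigmas[idx - 1]) / (sigmas[idx] - sigmas[idx - 1])
cols = np.arange(signal.size)
smoothed = (1 - w) * bank[idx - 1, cols] + w * bank[idx, cols]
```

Note this is an approximation: interpolating between two Gaussian-filtered signals is not the same as filtering with the intermediate Gaussian, which is the precision caveat raised above.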
>
>> I have copied my previous attempts (which were very simply written and
>> take a long time) into this pastebin: http://pastebin.com/KkcEATs7
> Thanks for sending it, but it's not clear to me at first glance how
> it works. Can you send a small sample with a synthetic signal
> (randn, whatever) showing how to run the procedures?
>
> And a question: is there any chance that in your problem you could
> first apply some mapping to your signal, a change of variables (like
> x -> log(x)), then apply a normal linear time-invariant filter in the
> transformed domain, and then apply the inverse transform? In that
> case you would first use interpolation to perform the mapping, then
> apply the fast filtering procedure, and finally do the inverse
> interpolation.
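A minimal sketch of that map/filter/unmap pipeline (function name and grids hypothetical): on a log-spaced grid a fixed sigma corresponds to a kernel whose width in the original variable grows roughly linearly with wavelength, which matches the variation described in the question.

```python
# Change-of-variables smoothing: resample onto a log grid, apply one
# fixed-width Gaussian there, resample back.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def smooth_log_grid(wave, flux, sigma_pixels):
    """Smooth `flux` with a Gaussian of fixed pixel width on a log grid."""
    log_wave = np.logspace(np.log10(wave[0]), np.log10(wave[-1]), wave.size)
    flux_log = np.interp(log_wave, wave, flux)       # forward mapping
    flux_log = gaussian_filter1d(flux_log, sigma_pixels)  # fast LTI filter
    return np.interp(wave, log_wave, flux_log)       # inverse mapping

wave = np.linspace(4000.0, 9000.0, 8000)             # illustrative grid
flux = np.random.default_rng(2).normal(size=wave.size)
smoothed = smooth_log_grid(wave, flux, sigma_pixels=5.0)
```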
>
>
> ++nic
>
>
>> Thanks for your help
>> Wolfgang
>> On 27/03/11 12:09 PM, Nicolau Werneck wrote:
>>> If I understand correctly, you want a filter that varies in "time".
>>> This time-variance makes it inherently more complicated to
>>> calculate than a normal linear time-invariant filter.
>>>
>>> I second Christopher's suggestion, try Cython out, it's great for this
>>> kind of thing. Or perhaps scipy.weave.
>>>
>>> ++nic
>>>
>>> On Sat, Mar 26, 2011 at 9:52 AM, Wolfgang Kerzendorf
>>> <wkerzendorf@googlemail.com> wrote:
>>>> Hello,
>>>>
>>>> I'm interested in a Gaussian smoothing where the kernel width
>>>> depends (linearly, in this case) on the index at which it
>>>> operates. I implemented it myself (badly, probably) and it takes
>>>> forever compared to the fixed-kernel Gaussian smoothing in ndimage.
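A direct implementation of this kind of variable-kernel smoother might look like the following (an illustrative O(n^2) sketch with a hypothetical sigma law; the actual pastebin code may differ). The Python loop over every pixel is what makes this slow compared to ndimage:

```python
# Naive variable-kernel Gaussian smoothing: one kernel per output pixel,
# with the width growing linearly with the index.
import numpy as np

def variable_gaussian_smooth(y, sigma0=1.0, slope=0.01):
    n = y.size
    x = np.arange(n)
    out = np.empty(n)
    for i in range(n):                      # Python loop: O(n^2), hence slow
        sigma = sigma0 + slope * i          # per-pixel kernel width
        w = np.exp(-0.5 * ((x - i) / sigma) ** 2)
        out[i] = np.dot(w, y) / w.sum()     # normalised weighted average
    return out

y = np.random.default_rng(3).normal(size=500)
smoothed = variable_gaussian_smooth(y)
```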
>>>>
>>>> I could interpolate the array to be smoothed onto a log-spaced
>>>> grid and keep the kernel fixed, but that is complicated and I'd
>>>> rather avoid it.
>>>>
>>>> Is there a good way of doing that?
>>>>
>>>> Cheers
>>>> Wolfgang
>>>> _______________________________________________
>>>> SciPy-User mailing list
>>>> SciPy-User@scipy.org
>>>> http://mail.scipy.org/mailman/listinfo/scipy-user
>>>>
>>>