[Numpy-discussion] performance matrix multiplication vs. matlab

David Cournapeau david@ar.media.kyoto-u.ac...
Tue Jun 9 22:55:52 CDT 2009


David Warde-Farley wrote:
> On 9-Jun-09, at 3:54 AM, David Cournapeau wrote:
>
>   
>> For example, what ML people call PCA is called Karhunen-Loève in
>> signal processing, and the concepts are quite similar.
>>     
>
>
> Yup. This seems to be a nice set of review notes:
>
> 	http://www.ece.rutgers.edu/~orfanidi/ece525/svd.pdf
>   

This indeed looks like a very nice review from a signal-processing
perspective. I never took the time to understand the
similarities/differences/connections between traditional SP approaches
and the machine learning approach. I wonder whether the subspace methods
a la matrix PENCIL/MUSIC and co. have a (useful) interpretation in a
more ML setting; I never really thought about it. I guess other people
have :)
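To make the PCA/KLT correspondence concrete, here is a small NumPy sketch (my own toy data, not from the review): the "ML-style" PCA via SVD of the centered data matrix and the "SP-style" KLT via eigendecomposition of the sample covariance recover the same basis and the same variances, up to sign.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: n samples x d features, with correlated columns
X = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 5))

# Center the data (both views assume zero-mean data)
Xc = X - X.mean(axis=0)

# "ML-style" PCA: SVD of the centered data matrix
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pca_components = Vt                  # principal directions (rows)
pca_var = s**2 / (Xc.shape[0] - 1)   # variance along each direction

# "SP-style" KLT: eigendecomposition of the sample covariance matrix
C = Xc.T @ Xc / (Xc.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]    # eigh returns ascending order
klt_var = eigvals[order]
klt_basis = eigvecs[:, order]

# Same variances, same subspace (eigenvectors can differ by sign)
assert np.allclose(pca_var, klt_var)
assert np.allclose(np.abs(pca_components), np.abs(klt_basis.T))
```

The sign ambiguity is why the comparison goes through absolute values: each eigenvector is only defined up to a factor of -1.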

> And going further than just PCA/KLT, tying it together with maximum  
> likelihood factor analysis/linear dynamical systems/hidden Markov  
> models,
>
> 	http://www.cs.toronto.edu/~roweis/papers/NC110201.pdf
>   

As much as I like this paper, I have always felt that you miss a lot of
insight when considering PCA only from a purely statistical POV. I
really like treating PCA from a function-approximation POV (chapter 9 of
Mallat's book on wavelets is crystal clear, for example, and it is based
on all that nice functional-space theory, like Besov spaces).
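One way to see the approximation-theoretic view in code (a sketch of my own, on raw uncentered data for brevity): the truncated SVD underlying PCA gives the *best* rank-k approximation in the least-squares sense (Eckart-Young), and the approximation error is exactly the energy in the discarded modes.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 20))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 5
# Rank-k reconstruction: keep only the k leading singular triplets
A_k = (U[:, :k] * s[:k]) @ Vt[:k]

# Eckart-Young: the Frobenius error equals the root of the
# summed squared singular values that were thrown away
err = np.linalg.norm(A - A_k, "fro")
assert np.isclose(err, np.sqrt(np.sum(s[k:] ** 2)))
```

This is the same "approximate a signal in the best k-term basis" reasoning that the function-approximation literature develops much further for adaptive (e.g. wavelet) bases.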

cheers,

David

