[SciPy-user] getting the "best" two eigenvectors for a PCA analysis with a power method
noel.oboyle2 at mail.dcu.ie
Wed Aug 31 05:37:35 CDT 2005
R calculates the PCs using singular value decomposition, instead of
using the eigenvalues of the covariance matrix. From the R manual for 'prcomp':
"The calculation is done by a singular value decomposition of the
(centered and possibly scaled) data matrix, not by using 'eigen'
on the covariance matrix. This is generally the preferred method
for numerical accuracy."
(1) Can anyone comment on what numerical accuracy means in this context,
and whether one should really care about this for principal component
analysis?
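To make the comparison concrete, here is a small NumPy sketch (not from the thread; the data is synthetic) of the two routes: SVD of the centred data matrix versus an eigendecomposition of the covariance matrix. Forming X'X squares the condition number, which is the numerical-accuracy concern the R manual alludes to; for well-conditioned data the two routes agree up to sign.

```python
import numpy as np

# Synthetic data with the shape mentioned later in the thread:
# 120 observations x 520 variables.
rng = np.random.RandomState(0)
X = rng.randn(120, 520)

# Centre the data (R's prcomp centres by default too).
Xc = X - X.mean(axis=0)

# Route 1: PCA via SVD of the centred data matrix, Xc = U S Vt.
# Rows of Vt are the principal axes; S**2/(n-1) are the variances.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Vt[:2]            # first two principal axes, shape (2, 520)
scores = Xc @ pcs.T     # projections of the data onto them

# Route 2: eigendecomposition of the covariance matrix.
cov = Xc.T @ Xc / (len(Xc) - 1)
evals, evecs = np.linalg.eigh(cov)   # eigenvalues in ascending order

# The dominant eigenvector matches the first right singular vector
# up to an overall sign.
assert np.allclose(np.abs(evecs[:, -1]), np.abs(pcs[0]), atol=1e-6)
```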
On Mon, 2005-08-29 at 09:23 +0200, Tiziano Zito wrote:
> On Fri 26 Aug, 18:46, Syd Diamond wrote:
> > I'll be honest -- I'm mostly interested in the end not the means right
> > now. I need to find the first and second principal components of a
> > 120x520 matrix.
> The package 'MDP' http://mdp-toolkit.sourceforge.net together with
> 'symeig' http://mdp-toolkit.sourceforge.net/symeig.html offers that
> functionality (and much more). Note that this is much faster than
> solving the standard eigenvalue problem with scipy.linalg.eig.
> After installing MDP and symeig you can simply do:
> >>> import mdp
> >>> out = mdp.pca(input_data, output_dim = 2)
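If installing MDP is not an option, the power method from the subject line can be sketched in plain NumPy: repeatedly apply the covariance matrix to a random vector and normalise to get the dominant eigenvector, then deflate and repeat for the second. This is an illustrative sketch under the assumption of a reasonable eigenvalue gap, not MDP's implementation.

```python
import numpy as np

def top_two_components(X, n_iter=1000, seed=0):
    """Power iteration with deflation: a minimal sketch, not MDP's code."""
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / (len(Xc) - 1)       # covariance matrix
    rng = np.random.RandomState(seed)
    comps = []
    for _ in range(2):
        v = rng.randn(C.shape[0])
        for _ in range(n_iter):
            v = C @ v
            v /= np.linalg.norm(v)       # keep the iterate unit length
        comps.append(v)
        lam = v @ C @ v                  # Rayleigh quotient = eigenvalue
        C = C - lam * np.outer(v, v)     # deflate the found component
    return np.array(comps)

X = np.random.RandomState(1).randn(120, 520)   # 120x520, as in the thread
pcs = top_two_components(X)                    # shape (2, 520)
```

Convergence is geometric in the ratio of the second to the first eigenvalue, so for nearly degenerate leading eigenvalues more iterations (or a Lanczos-type method) would be needed.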