Using Eigen3,
is there a more efficient way to calculate PCA? mat is the matrix of raw data:
MatrixXd centered = mat.rowwise() - mat.colwise().mean();
MatrixXd cov = centered.adjoint() * centered;
SelfAdjointEigenSolver<MatrixXd> decomp(cov);
I keep reading that adjoint is an expensive method. Thanks
CodePudding user response:
adjoint itself is cheap: it only returns a lazy view, no data is moved. Evaluating adjoint into a matrix is slightly more expensive since it has to transpose, but that's not what you are doing here. I'd like to see where this claim comes from. Maybe NumPy? There, conjugation and adjoint are fairly expensive, since NumPy doesn't have adjoint or conjugated views.
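If it helps, here is a small sketch of the difference between using adjoint inside a product and materializing it (the function name and comments are mine, not from the question):

#include <Eigen/Dense>

// adjoint() returns a lazy expression; no transposed copy is written to
// memory unless you assign it to a Matrix.
void adjoint_cost_sketch(const Eigen::MatrixXd& centered) {
    // The product reads 'centered' with swapped indexing; no transposed
    // copy is created first.
    Eigen::MatrixXd cov = centered.adjoint() * centered;

    // By contrast, this line does evaluate the adjoint into a new matrix,
    // which is the "slightly expensive" case.
    Eigen::MatrixXd centeredT = centered.adjoint();
}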
However, there is a simple way to improve your code: your matrix product is a special case that rankUpdate handles. The main point is that it only updates one triangular part of the matrix, since the other triangle is redundant.
Eigen::MatrixXd cov = Eigen::MatrixXd::Zero(centered.cols(), centered.cols());
cov.selfadjointView<Eigen::Lower>().rankUpdate(centered.adjoint()); // cov += centered.adjoint() * centered, lower triangle only
Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> decomp(cov);
The documentation of SelfAdjointEigenSolver states that only the lower triangular part of the input matrix is referenced, so that is the part we compute.
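For completeness, a minimal end-to-end sketch of the whole PCA step (the random test matrix and the choice to print only the leading component are illustrative assumptions, not part of the question):

#include <Eigen/Dense>
#include <iostream>

int main() {
    // Illustrative data: rows are samples, columns are variables.
    Eigen::MatrixXd mat = Eigen::MatrixXd::Random(100, 5);

    // Center each column.
    Eigen::MatrixXd centered = mat.rowwise() - mat.colwise().mean();

    // Scatter matrix (covariance up to a 1/(n-1) factor), lower triangle only.
    Eigen::MatrixXd cov = Eigen::MatrixXd::Zero(centered.cols(), centered.cols());
    cov.selfadjointView<Eigen::Lower>().rankUpdate(centered.adjoint());

    // Eigenvalues are returned in increasing order, so the principal
    // directions are the last columns of the eigenvector matrix.
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> decomp(cov);
    std::cout << "eigenvalues: " << decomp.eigenvalues().transpose() << "\n";
    std::cout << "leading component: "
              << decomp.eigenvectors().rightCols(1).transpose() << "\n";
}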