A symmetric matrix has the special property that all of its eigenvectors are not just linearly independent but also orthogonal, thus completing our proof.

In the first part of the proof, let $A$ be just some matrix, not necessarily symmetric, and let it have independent eigenvectors (i.e. no degeneracy). Furthermore, let $E = [e_1 \; e_2 \; \ldots \; e_n]$ be the matrix of eigenvectors placed in the columns.
Let $D$ be a diagonal matrix where the $i$-th eigenvalue is placed in the $(i,i)$ position.

We will now show that $AE = ED$. We can examine the columns of the right-hand and left-hand sides of the equation.

Left hand side:  $AE = [A e_1 \; A e_2 \; \ldots \; A e_n]$
Right hand side: $ED = [\lambda_1 e_1 \; \lambda_2 e_2 \; \ldots \; \lambda_n e_n]$

Evidently, if $AE = ED$ then $A e_i = \lambda_i e_i$ for all $i$. This equation is the definition of the eigenvalue equation. Therefore, it must be that $AE = ED$. A little rearrangement provides $A = EDE^{-1}$, completing the first part of the proof.

For the second part of the proof, we show that a symmetric matrix always has orthogonal eigenvectors. For some symmetric matrix $A$, let $\lambda_1$ and $\lambda_2$ be distinct eigenvalues with eigenvectors $e_1$ and $e_2$.

$$
\begin{aligned}
\lambda_1 e_1 \cdot e_2 &= (\lambda_1 e_1)^T e_2 \\
 &= (A e_1)^T e_2 \\
 &= e_1^T A^T e_2 \\
 &= e_1^T A e_2 \\
 &= e_1^T (\lambda_2 e_2) \\
\lambda_1 \, e_1 \cdot e_2 &= \lambda_2 \, e_1 \cdot e_2
\end{aligned}
$$

By the last relation we can conclude that $(\lambda_1 - \lambda_2)\, e_1 \cdot e_2 = 0$. Since we have assumed that the eigenvalues are distinct, it must be the case that $e_1 \cdot e_2 = 0$. Therefore, the eigenvectors of a symmetric matrix are orthogonal.

Let us back up now to our original postulate that $A$ is a symmetric matrix. By the second part of the proof, we know that the eigenvectors of $A$ are all orthonormal (we choose the eigenvectors to be normalized). This means that $E$ is an orthogonal matrix, so by Theorem 1, $E^T = E^{-1}$, and we can rewrite the final result:

$$ A = EDE^T $$

Thus, a symmetric matrix is diagonalized by a matrix of its eigenvectors.

5. For any arbitrary $m \times n$ matrix $X$, the symmetric matrix $X^T X$ has a set of orthonormal eigenvectors $\{\hat{v}_1, \hat{v}_2, \ldots, \hat{v}_n\}$ and a set of associated eigenvalues $\{\lambda_1, \lambda_2, \ldots, \lambda_n\}$. The set of vectors $\{X\hat{v}_1, X\hat{v}_2, \ldots, X\hat{v}_n\}$ then forms an orthogonal basis, where each vector $X\hat{v}_i$ is of length $\sqrt{\lambda_i}$.

All of these properties arise from the dot product of any two vectors from this set.

$$
\begin{aligned}
(X\hat{v}_i)\cdot(X\hat{v}_j) &= (X\hat{v}_i)^T (X\hat{v}_j) \\
 &= \hat{v}_i^T X^T X \hat{v}_j \\
 &= \hat{v}_i^T (\lambda_j \hat{v}_j) \\
 &= \lambda_j \, \hat{v}_i \cdot \hat{v}_j \\
(X\hat{v}_i)\cdot(X\hat{v}_j) &= \lambda_j \, \delta_{ij}
\end{aligned}
$$

The last relation arises because the set of eigenvectors of $X^T X$ is orthonormal, resulting in the Kronecker delta. In simpler terms, the last relation states:

$$
(X\hat{v}_i)\cdot(X\hat{v}_j) =
\begin{cases}
\lambda_j & i = j \\
0 & i \neq j
\end{cases}
$$

This equation states that any two vectors in the set are orthogonal.

The second property arises from the above equation by realizing that the length squared of each vector is:

$$ \|X\hat{v}_i\|^2 = (X\hat{v}_i)\cdot(X\hat{v}_i) = \lambda_i $$

APPENDIX B: Code

This code is written for Matlab 6.5 (Release 13) from Mathworks [8]. The code is not computationally efficient but explanatory (terse comments begin with a %).

[8] http://www.mathworks.com

This first version follows Section 5 by examining the covariance of the data set.

function [signals,PC,V] = pca1(data)
% PCA1: Perform PCA using covariance.
%   data    - MxN matrix of input data
%             (M dimensions, N trials)
%   signals - MxN matrix of projected data
%   PC      - each column is a PC
%   V       - Mx1 matrix of variances

[M,N] = size(data);

% subtract off the mean for each dimension
mn = mean(data,2);
data = data - repmat(mn,1,N);

% calculate the covariance matrix
covariance = 1 / (N-1) * data * data';

% find the eigenvectors and eigenvalues
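The listing above breaks off in this excerpt just after the covariance matrix is formed. A minimal sketch of how pca1 plausibly continues, assuming the usual eigendecompose-sort-project pattern; the variable names junk and rindices are illustrative and not taken from the original listing:

% plausible continuation of pca1 (a reconstruction, not the original listing)
[PC, V] = eig(covariance);       % columns of PC are eigenvectors of the covariance
V = diag(V);                     % pull the eigenvalues off the diagonal
% sort the variances in decreasing order
[junk, rindices] = sort(-1*V);
V = V(rindices);
PC = PC(:,rindices);
% project the original (mean-subtracted) data set onto the principal components
signals = PC' * data;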
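As a hypothetical usage example (not part of the original appendix), pca1 can be exercised on synthetic data. Assuming the function completes as sketched above, the checks below numerically confirm the properties proved in Appendix A: the eigenvectors of the symmetric covariance matrix are orthonormal, and the projected signals are decorrelated with variances given by V.

% hypothetical usage example
data = randn(3,1000);                        % 3 dimensions, 1000 trials
data(2,:) = data(1,:) + 0.1*randn(1,1000);   % make two dimensions strongly correlated
[signals, PC, V] = pca1(data);
disp(norm(PC'*PC - eye(3)));                 % near zero: the PCs are orthonormal
disp(norm(cov(signals') - diag(V)));         % near zero: projected data is decorrelated,
                                             % with variances equal to V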
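The dot-product identity in property 5 of Appendix A can also be checked numerically in a few lines. This sketch is illustrative and not from the original text; it builds the vectors $X\hat{v}_i$ from the eigenvectors of $X^T X$ and verifies that their Gram matrix equals the diagonal matrix of eigenvalues, i.e. $(X\hat{v}_i)\cdot(X\hat{v}_j) = \lambda_j \delta_{ij}$, so each vector has length $\sqrt{\lambda_i}$.

% numerical check of property 5 (illustrative)
X = randn(5,3);                       % an arbitrary 5x3 matrix
[Vhat, L] = eig(X'*X);                % orthonormal eigenvectors and eigenvalues of X'*X
G = (X*Vhat)'*(X*Vhat);               % Gram matrix: entry (i,j) is (X*vhat_i)'*(X*vhat_j)
disp(norm(G - L));                    % near zero: the vectors X*vhat_i are orthogonal
disp([sqrt(diag(G)) sqrt(diag(L))]);  % columns agree: each X*vhat_i has length sqrt(lambda_i)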