2. Mathematics of PCA

A 2-D facial image can be represented as a 1-D vector by concatenating each row (or column) into a long, thin vector.
Let us suppose we have $M$ vectors of size $N$ (= rows of image $\times$ columns of image) representing a set of sampled images, where the $p_j$'s represent the pixel values:

$$x_i = [p_1 \ldots p_N]^T, \quad i = 1, \ldots, M \tag{1}$$

The images are mean-centered by subtracting the mean image from each image vector. Let $m$ represent the mean image,

$$m = \frac{1}{M} \sum_{i=1}^{M} x_i \tag{2}$$

and let $w_i$ be the mean-centered image

$$w_i = x_i - m \tag{3}$$

Our goal is to find a set of $e_i$'s which have the largest possible projection onto each of the $w_i$'s. We wish to find a set of $M$ orthonormal vectors $e_i$ for which the quantity

$$\lambda_i = \frac{1}{M} \sum_{n=1}^{M} (e_i^T w_n)^2 \tag{4}$$

is maximized, subject to the orthonormality constraint

$$e_l^T e_k = \delta_{lk} \tag{5}$$

It has been shown that the $e_i$'s and $\lambda_i$'s are given by the eigenvectors and eigenvalues of the covariance matrix

$$C = W W^T \tag{6}$$

where $W$ is the matrix composed of the column vectors $w_i$ placed side by side. The size of $C$ is $N \times N$, which can be enormous: for example, images of size $64 \times 64$ produce a covariance matrix of size $4096 \times 4096$, so it is not practical to solve for the eigenvectors of $C$ directly. A common theorem in linear algebra states that the vectors $e_i$ and scalars $\lambda_i$ can be obtained by solving for the eigenvectors and eigenvalues of the $M \times M$ matrix $W^T W$. Let $d_i$ and $\mu_i$ be the eigenvectors and eigenvalues of $W^T W$, respectively:

$$W^T W d_i = \mu_i d_i \tag{7}$$

Multiplying both sides on the left by $W$ gives

$$W W^T (W d_i) = \mu_i (W d_i) \tag{8}$$

which means that the first $M - 1$ eigenvectors $e_i$ and eigenvalues $\lambda_i$ of $W W^T$ are given by $W d_i$ and $\mu_i$, respectively. $W d_i$ must be normalized in order to equal $e_i$. Since we sum over only a finite number of image vectors $M$, the rank of the covariance matrix cannot exceed $M - 1$ (the $-1$ comes from the subtraction of the mean vector $m$).
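The small-matrix trick of Eqs. (7)–(8) can be sketched in NumPy as follows. This is a minimal illustration, not the paper's implementation: the image dimensions and random pixel data are placeholders standing in for a real set of sampled images.

```python
import numpy as np

# Toy stand-in for M flattened face images of N pixels each; the
# dimensions and random values are placeholders, not real face data.
rng = np.random.default_rng(0)
M, N = 10, 32 * 32
X = rng.random((N, M))              # columns x_i, as in Eq. (1)

m = X.mean(axis=1, keepdims=True)   # mean image m, Eq. (2)
W = X - m                           # mean-centered columns w_i, Eq. (3)

# Rather than diagonalizing the huge N x N matrix C = W W^T (Eq. 6),
# diagonalize the small M x M matrix W^T W (Eq. 7).
mu, D = np.linalg.eigh(W.T @ W)     # eigenvalues mu_i, eigenvectors d_i

# Sort descending and drop the zero eigenvalue forced by the mean
# subtraction: the rank of C cannot exceed M - 1.
order = np.argsort(mu)[::-1]
mu, D = mu[order], D[:, order]
keep = mu > 1e-8 * mu[0]
mu, D = mu[keep], D[:, keep]

# Map back via e_i = W d_i (Eq. 8) and normalize each column,
# so the columns of E are the orthonormal eigenvectors e_i.
E = W @ D
E /= np.linalg.norm(E, axis=0)
```

The columns of `E` are then orthonormal eigenvectors of $W W^T$ with eigenvalues `mu`, which can be verified directly against $C = W W^T$ on this small example.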