Reprinted from The MathWorks News & Notes | October 2006 | www.mathworks.com

computational work than Gaussian elimination, but it has impeccable numerical properties. You can judge whether the singular values are small enough to be regarded as negligible, and if they are, analyze the relevant singular system.

Let E_k denote the outer product of the k-th left and right singular vectors, that is

    E_k = u_k v_k^T

Then A can be expressed as a sum of rank-1 matrices,

    A = Σ_{k=1}^{n} σ_k E_k

If you order the singular values in decreasing order, σ_1 > σ_2 > ... > σ_n, and truncate the sum after r terms, the result is a rank-r approximation to the original matrix. The error in the approximation depends upon the magnitude of the neglected singular values. When you do this with a matrix of data that has been centered, by subtracting the mean of each column from the entire column, the process is known as principal component analysis (PCA). The right singular vectors, v_k, are the components, and the scaled left singular vectors, σ_k u_k, are the scores. PCAs are usually described in terms of the eigenvalues and eigenvectors of the covariance matrix, AA^T, but the SVD approach sometimes has better numerical properties.

Figure 2. Rank 12, 50, and 120 approximations to a rank 598 color photo of Gene Golub.

SVD and matrix approximation are often illustrated by approximating images. Our example starts with the photo on Gene Golub's Web page (Figure 2). The image is 897-by-598 pixels. We stack the red, green, and blue JPEG components vertically to produce a 2691-by-598 matrix. We then do just one SVD computation. After computing a low-rank approximation, we repartition the matrix into RGB components.
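The rank-r truncation and the PCA connection described above can be sketched numerically. The column's own examples use MATLAB; this NumPy sketch with a random stand-in matrix is only an illustration, not the article's code:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 40))          # stand-in for the stacked image matrix

# One SVD computation; s holds the singular values in decreasing order
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Truncate the sum A = sum_k s[k] * outer(U[:,k], Vt[k,:]) after r terms
r = 10
A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# The 2-norm error of the rank-r approximation is exactly the first
# neglected singular value, s[r]
assert np.isclose(np.linalg.norm(A - A_r, 2), s[r])

# PCA via the SVD: center each column, then the right singular vectors
# are the components and the scaled left singular vectors are the scores
Ac = A - A.mean(axis=0)
Uc, sc, Vct = np.linalg.svd(Ac, full_matrices=False)
components = Vct            # rows are the principal components, v_k
scores = Uc * sc            # columns are the scores, sigma_k * u_k
assert np.allclose(scores, Ac @ components.T)
```

The first assertion is the Eckart–Young result the text alludes to: dropping everything after the r-th term leaves a 2-norm error equal to the largest neglected singular value.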
With just rank 12, the colors are accurately reproduced and Gene is recognizable, especially if you squint at the picture to allow your eyes to reconstruct the original image. With rank 50, you can begin to read the mathematics on the white board behind Gene. With rank 120, the image is almost indistinguishable from the full rank 598. (This is not a particularly effective image compression technique. In fact, my friends in image processing call it "image degradation.")

So far in this column I have hardly mentioned eigenvalues. I wanted to show that it is possible to discuss singular values without discussing eigenvalues, but of course the two are closely related. In fact, if A is square, symmetric, and positive definite, its singular values and eigenvalues are equal, and its left and right singular vectors are equal to each other and to its eigenvectors. More generally, the singular values of A are the square roots of the eigenvalues of A^T A or AA^T.

Singular values are relevant when the matrix is regarded as a transformation from one space to a different space with possibly different dimensions. Eigenvalues are relevant when the matrix is regarded as a transformation from one space into itself, as, for example, in linear ordinary differential equations.

Google finds over 3,000,000 Web pages that mention "singular value decomposition" and almost 200,000 pages that mention "SVD MATLAB." I knew about a few of these pages before I started to write this column. I came across some other interesting ones as I surfed around.

Professor SVD made all of this, and much more, possible. Thanks, Gene.
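The relationship between singular values and eigenvalues stated above can be checked numerically. Again, a NumPy sketch with random data rather than the column's MATLAB:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))

# Singular values of A are the square roots of the eigenvalues of A^T A
s = np.linalg.svd(A, compute_uv=False)                 # descending order
eig = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]       # descending order
assert np.allclose(s, np.sqrt(eig))

# For a symmetric positive definite matrix, singular values and
# eigenvalues coincide
B = A.T @ A
assert np.allclose(np.linalg.svd(B, compute_uv=False),
                   np.sort(np.linalg.eigvalsh(B))[::-1])
```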