Chapter 2. Solution of Linear Algebraic Equations

2.1 Gauss-Jordan Elimination

For inverting a matrix, Gauss-Jordan elimination is about as efficient as any other method. For solving sets of linear equations, Gauss-Jordan elimination produces both the solution of the equations for one or more right-hand side vectors b, and also the matrix inverse A⁻¹. However, its principal weaknesses are (i) that it requires all the right-hand sides to be stored and manipulated at the same time, and (ii) that when the inverse matrix is not desired, Gauss-Jordan is three times slower than the best alternative technique for solving a single linear set (§2.3). The method's principal strength is that it is as stable as any other direct method, perhaps even a bit more stable when full pivoting is used (see below).

If you come along later with an additional right-hand side vector, you can multiply it by the inverse matrix, of course. This does give an answer, but one that is quite susceptible to roundoff error, not nearly as good as if the new vector had been included with the set of right-hand side vectors in the first instance.
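As a concrete illustration of that last point, here is a minimal sketch (not taken from the book's routines; the function name and the flat row-major storage are assumptions of the sketch) of how a stored inverse handles a late-arriving right-hand side, namely by a plain matrix-vector multiply:

    /* Sketch: given a previously computed inverse ainv of an n x n matrix
       (stored row by row) and a new right-hand side b, form x = A^-1 b.
       Convenient, but, as noted above, more sensitive to roundoff than
       including b among the original right-hand sides would have been. */
    void solve_with_inverse(const double *ainv, const double *b,
                            double *x, int n)
    {
        for (int i = 0; i < n; i++) {
            double sum = 0.0;
            for (int j = 0; j < n; j++)
                sum += ainv[i*n + j] * b[j];   /* x_i = sum_j (A^-1)_ij b_j */
            x[i] = sum;
        }
    }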
For these reasons, Gauss-Jordan elimination should usually not be your method of first choice, either for solving linear equations or for matrix inversion. The decomposition methods in §2.3 are better. Why do we give you Gauss-Jordan at all? Because it is straightforward, understandable, solid as a rock, and an exceptionally good "psychological" backup for those times that something is going wrong and you think it might be your linear-equation solver.

Some people believe that the backup is more than psychological, that Gauss-Jordan elimination is an "independent" numerical method. This turns out to be mostly myth. Except for the relatively minor differences in pivoting, described below, the actual sequence of operations performed in Gauss-Jordan elimination is very closely related to that performed by the routines in the next two sections.

For clarity, and to avoid writing endless ellipses (···), we will write out equations only for the case of four equations and four unknowns, and with three different right-hand side vectors that are known in advance. You can write bigger matrices and extend the equations to the case of N × N matrices, with M sets of right-hand side vectors, in completely analogous fashion. The routine implemented below is, of course, general.
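Before that full routine, the following is a simplified sketch of the same procedure, and emphatically not the book's gaussj(): it uses only partial (row) pivoting rather than full pivoting, and an explicitly augmented matrix [A | b1 ... bM | 1] rather than in-place bookkeeping; the function name, storage layout, and example system are inventions of the sketch.

    /* Minimal Gauss-Jordan sketch with partial (row) pivoting, operating
       on an augmented matrix.  NOT the gaussj() routine of this section. */
    #include <stdio.h>
    #include <math.h>

    /* Reduce the n x (n+m) augmented matrix aug = [A | B] so that the A
       block becomes the identity.  The m appended columns then hold the
       solution of A x = b for each appended right-hand side; columns
       initialized to the identity matrix come back as A^-1.
       Returns 0 on success, -1 if A is singular to working precision. */
    int gauss_jordan(double *aug, int n, int m)
    {
        int ncol = n + m;
        for (int i = 0; i < n; i++) {
            /* Partial pivoting: bring the largest |element| in column i
               (from row i down) up to the pivot position. */
            int piv = i;
            for (int k = i + 1; k < n; k++)
                if (fabs(aug[k*ncol + i]) > fabs(aug[piv*ncol + i]))
                    piv = k;
            if (aug[piv*ncol + i] == 0.0)
                return -1;                        /* singular matrix */
            if (piv != i)
                for (int j = 0; j < ncol; j++) {  /* swap rows i and piv */
                    double t = aug[i*ncol + j];
                    aug[i*ncol + j] = aug[piv*ncol + j];
                    aug[piv*ncol + j] = t;
                }
            /* Scale the pivot row so the pivot element becomes 1. */
            double scale = 1.0 / aug[i*ncol + i];
            for (int j = 0; j < ncol; j++)
                aug[i*ncol + j] *= scale;
            /* Eliminate column i from every OTHER row, above as well as
               below the pivot; this is what distinguishes Gauss-Jordan
               from ordinary Gaussian elimination. */
            for (int k = 0; k < n; k++) {
                if (k == i) continue;
                double f = aug[k*ncol + i];
                for (int j = 0; j < ncol; j++)
                    aug[k*ncol + j] -= f * aug[i*ncol + j];
            }
        }
        return 0;
    }

    int main(void)
    {
        /* Three equations, one right-hand side b = (3,3,3), and the 3x3
           identity appended so that its columns come back as A^-1. */
        double aug[3*7] = {
            0, 2, 1,   3,   1, 0, 0,
            1, 1, 1,   3,   0, 1, 0,
            2, 1, 0,   3,   0, 0, 1
        };
        if (gauss_jordan(aug, 3, 4) != 0) {
            fprintf(stderr, "singular matrix\n");
            return 1;
        }
        /* For this system the solution is x = (1, 1, 1); columns 4..6 of
           each row now hold the inverse of A. */
        printf("x = %g %g %g\n", aug[0*7+3], aug[1*7+3], aug[2*7+3]);
        return 0;
    }

Appending the identity is what lets the method deliver both the solutions and A⁻¹ in a single pass, at the cost, already noted, of carrying every right-hand side along through the whole elimination.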