866 Chapter 19. Partial Differential Equations

The Gauss-Seidel method, equation (19.5.6), corresponds to the matrix decomposition

    (L + D) · x^(r) = −U · x^(r−1) + b                      (19.5.13)

The fact that L is on the left-hand side of the equation follows from the updating in place, as you can easily check if you write out (19.5.13) in components. One can show [1-3] that the spectral radius is just the square of the spectral radius of the Jacobi method. For our model problem, therefore,

    ρ_s ≃ 1 − π²/J²                                         (19.5.14)

    r ≃ (p J² ln 10)/π² ≈ (1/4) p J²                        (19.5.15)

The factor of two improvement in the number of iterations over the Jacobi method still leaves the method impractical.

Successive Overrelaxation (SOR)

We get a better algorithm, one that was the standard algorithm until the 1970s, if we make an overcorrection to the value of x^(r) at the rth stage of Gauss-Seidel iteration, thus anticipating future corrections. Solve (19.5.13) for x^(r), add and subtract x^(r−1) on the right-hand side, and hence write the Gauss-Seidel method as

    x^(r) = x^(r−1) − (L + D)^(−1) · [(L + D + U) · x^(r−1) − b]    (19.5.16)

The term in square brackets is just the residual vector ξ^(r−1), so

    x^(r) = x^(r−1) − (L + D)^(−1) · ξ^(r−1)                (19.5.17)

Now overcorrect, defining

    x^(r) = x^(r−1) − ω (L + D)^(−1) · ξ^(r−1)              (19.5.18)

Here ω is called the overrelaxation parameter, and the method is called successive overrelaxation (SOR).

The following theorems can be proved [1-3]:

• The method is convergent only for 0 < ω < 2. If 0 < ω < 1, we speak of underrelaxation.
• Under certain mathematical restrictions generally satisfied by matrices arising from finite differencing, only overrelaxation (1 < ω < 2) can give faster convergence than the Gauss-Seidel method.
• If ρ_Jacobi is the spectral radius of the Jacobi iteration (so that its square is the spectral radius of the Gauss-Seidel iteration), then the optimal choice for ω is given by

    ω = 2 / (1 + √(1 − ρ²_Jacobi))                          (19.5.19)
Sample page from NUMERICAL RECIPES IN C: THE ART OF SCIENTIFIC COMPUTING (ISBN 0-521-43108-5). Copyright (C) 1988-1992 by Cambridge University Press. Programs Copyright (C) 1988-1992 by Numerical Recipes Software. Permission is granted for internet users to make one paper copy for their own personal use. Further reproduction, or any copying of machine-readable files (including this one) to any server computer, is strictly prohibited. To order Numerical Recipes books or CDROMs, visit website http://www.nr.com or call 1-800-872-7423 (North America only), or send email to directcustserv@cambridge.org (outside North America).