O(N^2 log_2 N), and discovering it would change everything. In 1985 I made a bet with Peter Alfeld of the University of Utah that a matrix algorithm with complexity O(N^(2+ε)) for any ε > 0 would be found within ten years. None was, and I gave Alfeld a check for $100. We renewed our bet, however, to 2005, and in that year I will renew it again if necessary. One morning, with luck, the headlines will appear. I think fifty years should be long enough.

7. Multipole methods and their descendants will be ubiquitous.

The conjugate gradient and Lanczos algorithms were invented around 1950, and their story is a curious one. Nowadays we have no doubt as to what these methods are good for: they are matrix iterations, which for certain structured matrices bring those O(N^3) operation counts down to O(N^2) or even better. Though there are constants hidden in the "O", these methods are often much faster than Gaussian elimination and its relatives when N is large.
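To make the point concrete, here is a minimal sketch of the conjugate gradient iteration in Python; the random symmetric positive definite test matrix, the tolerance, and the function name are assumptions of this sketch, not details from the text above.

import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Sketch of conjugate gradient for A x = b with A symmetric
    positive definite. Each iteration costs one matrix-vector
    product, O(N^2) for a dense matrix, so convergence in far
    fewer than N iterations beats the O(N^3) of elimination."""
    N = len(b)
    max_iter = max_iter if max_iter is not None else N
    x = np.zeros(N)
    r = b - A @ x                    # residual
    p = r.copy()                     # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # step length
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:    # residual small enough: done
            break
        p = r + (rs_new / rs) * p    # next A-conjugate direction
        rs = rs_new
    return x

# Hypothetical test problem: a well-conditioned SPD matrix.
N = 200
M = np.random.randn(N, N)
A = M @ M.T + N * np.eye(N)
b = np.random.randn(N)
print(np.linalg.norm(A @ conjugate_gradient(A, b) - b))

In exact arithmetic the iteration finishes in at most N steps; the practical point is that for many structured matrices it reaches working accuracy in far fewer.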
What is curious is that Hestenes, Stiefel, Lanczos and the rest didn't see this coming. In the 1950s, N was too small for conjugate gradients and Lanczos yet to be competitive, but all the mathematical pieces were in place. These men knew something of the convergence properties of their iterations, enough to have been able to predict that eventually, as machines grew faster, they must beat the competition. Yet they seem not to have made this prediction. A numerical analyst writing an essay like this one in 1960 might not have mentioned conjugate gradients at all.

It is with this history in mind that I mention multipole methods, by which I mean methods related to the recent algorithms of Rokhlin and Greengard for N-body problems and integral equations. Times have changed, and we are all asymptotickers. When multipole methods were being invented in the 1980s, they were competitive in 2D but not 3D. Yet Rokhlin and Greengard saw immediately that these techniques reduced operation counts from O(N^2) to O(N), give or take a logarithmic factor, so how could they not win in the long run? And so they will.

The success of multipole methods will exemplify a general trend. As time goes by, large-scale numerical computations rely more on approximate algorithms, even for problems that might in principle be solved exactly in a finite number of steps. Approximate algorithms are more robust than exact ones, and they are also often faster.
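To illustrate what is being accelerated, here is a minimal sketch in Python; it is not Rokhlin and Greengard's algorithm itself, and the random planar sources, their strengths, and the single well-separated target are assumptions of the sketch. It compares the direct pairwise sum, which costs O(N^2) over N targets, with the crudest one-term multipole approximation of a distant cluster.

import numpy as np

rng = np.random.default_rng(0)
N = 1000
src = rng.random((N, 2))            # source points in the unit square
q = rng.random(N)                   # source strengths
target = np.array([10.0, 10.0])     # one well-separated target

# Direct evaluation: O(N) per target, hence O(N^2) for N targets.
r = np.linalg.norm(src - target, axis=1)
phi_direct = np.sum(q * np.log(r))  # 2D logarithmic potential

# One-term multipole ("monopole") approximation: the cluster acts
# like its total strength placed at its weighted centroid. Adding
# higher-order terms shrinks the error geometrically for
# well-separated clusters, and organizing such expansions
# hierarchically is what brings the overall cost down to O(N).
center = np.average(src, axis=0, weights=q)
phi_far = q.sum() * np.log(np.linalg.norm(target - center))

print(phi_direct, phi_far)          # close, since the target is far away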
8. Breakthroughs will have occurred in matrix preconditioners, spectral methods, and time stepping for partial differential equations.

It is hard not to be optimistic about merely technical hurdles. The business of matrix preconditioners is vitally important, but it is a jungle these days; surely improvements are in store! Spectral methods for PDEs are in a similar state: remarkably powerful, but varying awkwardly from one application to the next. Order is needed here, and it will come. As for time-stepping, this is the old problem of stiffness, reasonably well in hand for ODEs but still unsolved in a general way for PDEs. To this day, the CFL restriction constrains our computations all across the range of science and engineering.
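To see the kind of constraint at stake, here is a minimal sketch in Python; the 1D heat equation u_t = u_xx, the grid, and the initial condition are assumptions chosen for illustration. The explicit scheme is stable only when dt <= dx^2/2, so halving the grid spacing forces four times as many time steps.

import numpy as np

def heat_explicit(N=100, t_final=0.1):
    """Forward Euler for u_t = u_xx on [0, 1] with zero boundary
    values. Stability requires dt <= dx**2 / 2, a CFL-type
    restriction: the time step is dictated by the grid rather
    than by the accuracy we actually want."""
    dx = 1.0 / N
    dt = 0.5 * dx**2                  # largest stable step
    x = np.linspace(0.0, 1.0, N + 1)
    u = np.sin(np.pi * x)             # hypothetical initial condition
    for _ in range(round(t_final / dt)):
        u[1:-1] += (dt / dx**2) * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return x, u

x, u = heat_explicit()
exact = np.sin(np.pi * x) * np.exp(-np.pi**2 * 0.1)
print(np.max(np.abs(u - exact)))      # small discretization error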
To get around this constraint, time steps are taken smaller than we would wish, huge matrix