Sample page from NUMERICAL RECIPES IN C: THE ART OF SCIENTIFIC COMPUTING (ISBN 0-521-43108-5). Copyright (C) 1988-1992 by Cambridge University Press. Programs Copyright (C) 1988-1992 by Numerical Recipes Software. Permission is granted for internet users to make one paper copy for their own personal use. Further reproduction, or any copying of machine-readable files (including this one) to any server computer, is strictly prohibited. To order Numerical Recipes books or CDROMs, visit website http://www.nr.com or call 1-800-872-7423 (North America only), or send email to directcustserv@cambridge.org (outside North America).
CITED REFERENCES AND FURTHER READING:
Hammersley, J.M., and Handscomb, D.C. 1964, Monte Carlo Methods (London: Methuen).
Shreider, Yu.A. (ed.) 1966, The Monte Carlo Method (Oxford: Pergamon).
Sobol', I.M. 1974, The Monte Carlo Method (Chicago: University of Chicago Press).
Kalos, M.H., and Whitlock, P.A. 1986, Monte Carlo Methods (New York: Wiley).

7.7 Quasi- (that is, Sub-) Random Sequences

We have just seen that choosing N points uniformly at random in an n-dimensional space leads to an error term in Monte Carlo integration that decreases as 1/√N. In essence, each new point sampled adds linearly to an accumulated sum that will become the function average, and also linearly to an accumulated sum of squares that will become the variance (equation 7.6.2). The estimated error comes from the square root of this variance, hence the power N^(-1/2).

Just because this square root convergence is familiar does not, however, mean that it is inevitable. A simple counterexample is to choose sample points that lie on a Cartesian grid, and to sample each grid point exactly once (in whatever order). The Monte Carlo method thus becomes a deterministic quadrature scheme (albeit a simple one) whose fractional error decreases at least as fast as N^(-1) (even faster if the function goes to zero smoothly at the boundaries of the sampled region, or is periodic in the region).

The trouble with a grid is that one has to decide in advance how fine it should be. One is then committed to completing all of its sample points. With a grid, it is not convenient to "sample until" some convergence or termination criterion is met.
One might ask if there is not some intermediate scheme, some way to pick sample points "at random," yet spread out in some self-avoiding way, avoiding the chance clustering that occurs with uniformly random points.

A similar question arises for tasks other than Monte Carlo integration. We might want to search an n-dimensional space for a point where some (locally computable) condition holds. Of course, for the task to be computationally meaningful, there had better be continuity, so that the desired condition will hold in some finite n-dimensional neighborhood. We may not know a priori how large that neighborhood is, however. We want to "sample until" the desired point is found, moving smoothly to finer scales with increasing samples. Is there any way to do this that is better than uncorrelated, random samples?

The answer to the above question is "yes." Sequences of n-tuples that fill n-space more uniformly than uncorrelated random points are called quasi-random sequences. That term is something of a misnomer, since there is nothing "random" about quasi-random sequences: They are cleverly crafted to be, in fact, sub-random. The sample points in a quasi-random sequence are, in a precise sense, "maximally avoiding" of each other.

A conceptually simple example is Halton's sequence [1]. In one dimension, the jth number H_j in the sequence is obtained by the following steps: (i) Write j as a number in base b, where b is some prime. (For example, j = 17 in base b = 3 is 122.) (ii) Reverse the digits and put a radix point (i.e., a decimal point base b) in front of