gression (George and McCulloch 1993); spatial statistics (Raftery and Banfield 1991); and longitudinal studies (Lange et al. 1992). Many of these applications were forwarded through other developments such as the Adaptive Rejection Sampling of Gilks (1992) and Gilks et al. (1995), and the simulated tempering approaches of Geyer and Thompson (1995) or Neal (1996).

5 After the Revolution

After the revolution comes the “second” revolution, but now we have a more mature field. The revolution has slowed, and the problems are being solved in, perhaps, deeper and more sophisticated ways (even though Gibbs sampling also offers the amateur the possibility of handling Bayesian analysis in complex models at little cost, as exhibited by the widespread use of BUGS). But, as before, the methodology continues to expand the set of problems for which statisticians can provide meaningful solutions, and thus continues to further the impact of Statistics.

5.1 A Brief Glimpse at Particle Systems

The realization of the possibilities of iterating importance sampling is not new: in fact, it is about as old as Monte Carlo methods themselves! It can be found in the molecular simulation literature of the 50’s, as in Hammersley and Morton (1954), Rosenbluth and Rosenbluth (1955), and Marshall (1965). Hammersley and colleagues proposed such a method to simulate a self-avoiding random walk (Madras and Slade 1993) on a grid, due to the huge inefficiency of regular importance sampling and rejection techniques. Although this early implementation occurred in particle physics, the use of the term “particle” only dates back to Kitagawa (1996), while Carpenter et al. (1997) coined the term “particle filter”. In signal processing, early occurrences of a “particle filter” can be traced back to Handschin and Mayne (1969).

More in connection with our theme, the landmark paper of Gordon et al. (1993) introduced the bootstrap filter which, while formally connected with importance sampling, involves past simulations and possible MCMC steps (Gilks and Berzuini 2001). In parallel, sequential imputation was developed in Kong et al. (1994), while Liu and Chen (1995) first formally pointed out the importance of resampling in sequential Monte Carlo, a term coined by them. The more recent literature on the topic bridges the gap even further by making adaptive MCMC a possibility (see Andrieu et al. 2004).
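To make the propagate–weight–resample cycle behind the bootstrap filter concrete, the following sketch runs a minimal bootstrap particle filter on a toy linear-Gaussian state-space model. The model, parameter values, and particle count are illustrative assumptions chosen here, not taken from Gordon et al. (1993); the resampling step is the one whose importance Liu and Chen (1995) stressed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear-Gaussian state-space model (illustrative choice):
#   x_t = 0.9 * x_{t-1} + v_t,  v_t ~ N(0, 1)      (state transition)
#   y_t = x_t + w_t,            w_t ~ N(0, 0.5^2)  (observation)
phi, sigma_v, sigma_w = 0.9, 1.0, 0.5

# Simulate a short observation record from this model.
T = 50
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + sigma_v * rng.standard_normal()
y = x + sigma_w * rng.standard_normal(T)

# Bootstrap filter: propagate particles through the transition density
# (the "bootstrap" proposal), weight by the likelihood, then resample.
N = 1000
particles = rng.standard_normal(N)   # initial particle cloud
filtered_means = np.zeros(T)
for t in range(T):
    # 1. Propagate: importance sampling from the prior transition.
    particles = phi * particles + sigma_v * rng.standard_normal(N)
    # 2. Weight by the observation likelihood p(y_t | x_t).
    logw = -0.5 * ((y[t] - particles) / sigma_w) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filtered_means[t] = np.sum(w * particles)
    # 3. Resample (multinomial) to combat weight degeneracy.
    particles = particles[rng.choice(N, size=N, p=w)]
```

Without step 3, the weights of all but a few particles collapse toward zero as t grows; resampling discards low-weight particles and duplicates high-weight ones, which is precisely the point made formally by Liu and Chen (1995).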