4. SOME LIMIT THEORY FOR BOOTSTRAP METHODS 13

4 Some limit theory for bootstrap methods

We begin again with Efron's nonparametric bootstrap. Our goal will be to show that the asymptotic behavior of the distribution of the nonparametric bootstrap estimator "mimics" the behavior of the original estimator in probability or almost surely: if we are estimating $T(P)$ by $T(\mathbb{P}_n)$ and we know (perhaps by a delta method argument) that
$$\sqrt{n}\,(T(\mathbb{P}_n) - T(P)) \to_d N(0, V^2(P)),$$
then our goal will be to show that the bootstrap estimator satisfies
$$\sqrt{n}\,(T(\mathbb{P}_n^*) - T(\mathbb{P}_n)) \to_d N(0, V^2(P)) \quad \text{in probability or a.s.}$$

For concreteness, first consider the sample mean of a distribution $P$ on $\mathbb{R}$: if $X \sim P$ and $EX^2 < \infty$, then for $T(P) = \int x \, dP(x) \equiv \mu(P)$ we know that
$$\sqrt{n}\,(T(\mathbb{P}_n) - T(P)) = \sqrt{n}\,(\overline{X}_n - \mu(P)) \to_d N(0, \mathrm{Var}_P(X)).$$

The corresponding statement for the bootstrap is:

Theorem 4.1 If $EX^2 < \infty$, then for a.e. sequence $X_1, X_2, \ldots$,
$$\sqrt{n}\,(T(\mathbb{P}_n^*) - T(\mathbb{P}_n)) = \sqrt{n}\,(\overline{X}_n^* - \overline{X}_n) \to_d N(0, \mathrm{Var}(X)).$$

Proof. Now $E_* X_{ni}^* = n^{-1} \sum_{i=1}^n X_i(\omega) = \overline{X}_n(\omega)$, and
$$\mathrm{Var}_*(X_{ni}^*) = \frac{1}{n} \sum_{i=1}^n (X_i(\omega) - \overline{X}_n(\omega))^2 \equiv S_n^2.$$
It follows that
$$\sqrt{n}\,(\overline{X}_n^* - \overline{X}_n(\omega)) = \sum_{i=1}^n Z_{ni}$$
where $Z_{ni} \equiv n^{-1/2}(X_{ni}^* - \overline{X}_n(\omega))$, $i = 1, \ldots, n$, are independent, have $E_* Z_{ni} = 0$, $\sigma_{ni}^2 = n^{-1} S_n^2$, and $\sigma_n^2 = \sum_{i=1}^n \sigma_{ni}^2 = S_n^2 \to_{a.s.} \sigma^2$. Finally, for $\epsilon > 0$, the Lindeberg condition is
\begin{align*}
\frac{1}{\sigma_n^2} \sum_{i=1}^n E_* |Z_{ni}|^2 \, 1\{|Z_{ni}| > \epsilon \sigma_n\}
&= \frac{1}{S_n^2} \, n \, E_* |n^{-1/2}(X_{n1}^* - \overline{X}_n(\omega))|^2 \, 1\{|X_{n1}^* - \overline{X}_n| > \sqrt{n}\,\epsilon S_n\} \\
&= \frac{1}{S_n^2} \, \frac{1}{n} \sum_{i=1}^n |X_i(\omega) - \overline{X}_n(\omega)|^2 \, 1\{|X_i - \overline{X}_n| > \sqrt{n}\,\epsilon S_n\} \\
&\le 1\{\max_{1 \le i \le n} |X_i - \overline{X}_n| > \epsilon \sqrt{n}\, S_n\} \to_{a.s.} 0
\end{align*}
since $E|X - \mu|^2 < \infty$ implies that
$$\frac{1}{\sqrt{n}} \max_{1 \le i \le n} |X_i - \overline{X}_n| \le \frac{1}{\sqrt{n}} \max_{1 \le i \le n} |X_i - \mu| + \frac{1}{\sqrt{n}} |\mu - \overline{X}_n| \to_{a.s.} 0,$$
and hence the theorem follows from the Lindeberg-Feller Central Limit Theorem. ✷

The above proof is basically from Bickel and Freedman (1981). The more refined statements of the following theorem are due to Singh (1981).
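Theorem 4.1 can be checked numerically: resampling with replacement from the empirical distribution $\mathbb{P}_n$ and forming the roots $\sqrt{n}(\overline{X}_n^* - \overline{X}_n)$ should produce a distribution whose variance is close to $\mathrm{Var}(X)$. The following is a minimal sketch (not part of the notes); the choice of an exponential population and the constants `n`, `B` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Original sample from an exponential distribution with scale 2,
# so EX^2 < infinity and Var(X) = 4 (illustrative choice).
n = 500
x = rng.exponential(scale=2.0, size=n)
xbar = x.mean()

# B bootstrap samples: each row is X*_{n1}, ..., X*_{nn} drawn i.i.d.
# from the empirical distribution P_n (sampling indices with replacement).
B = 5000
idx = rng.integers(0, n, size=(B, n))
boot_means = x[idx].mean(axis=1)

# Bootstrap roots sqrt(n)(Xbar*_n - Xbar_n); by Theorem 4.1 these are
# approximately N(0, Var(X)) for a.e. sample sequence.
roots = np.sqrt(n) * (boot_means - xbar)

# The conditional variance of a root is exactly S_n^2, so the Monte Carlo
# variance of `roots` should be close to the sample variance of x.
print("variance of bootstrap roots:", roots.var())
print("sample variance S_n^2:      ", x.var())
```

Note that `x.var()` here is the $1/n$-normalized variance $S_n^2$ from the proof, which is the exact conditional variance $\mathrm{Var}_*$ of each root; only Monte Carlo error (of order $\sqrt{2/B}$ relative) separates the two printed numbers.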