Taking for each $p_k$ the densities $f_k$ used previously, we obtain similarly densities
$$
f(x, \theta) = [1 - u(\theta - k)]\,f_k(x) + u(\theta - k)\,f_{k+1}(x).
$$
The function $u$ can be constructed, for instance, by taking a multiple of the indefinite integral of the function $\exp\{-[\tfrac{1}{t} + \tfrac{1}{1-t}]\}$ for $t \in [0, 1)$ and zero otherwise. If so, $f(x, \theta)$ is certainly infinitely differentiable in $\theta$. Also the integral $\int f(x, \theta)\,dx$ can be differentiated infinitely under the integral sign. There is a slight annoyance that at all integer values of $\theta$ all the derivatives vanish. To cure this take $\alpha = 10^{-10^{137}}$ and let
$$
g(x, \theta) = \tfrac{1}{2}\,[\,f(x, \theta) + f(x, \theta + \alpha e^{-\theta})\,].
$$
Then, certainly, everything is under control and the famous conditions in Cramér's text are all duly satisfied. Furthermore, $\theta \neq \theta'$ implies
$$
\int |g(x, \theta) - g(x, \theta')|\,dx > 0.
$$
In spite of all this, whatever may be the true value $\theta_0$, the maximum likelihood estimate still tends almost surely to infinity.
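The weight function $u$ described above is easy to realize numerically. The following Python sketch is only an illustration under mild assumptions: it normalizes the indefinite integral of $\exp\{-[\tfrac{1}{t} + \tfrac{1}{1-t}]\}$ so that $u$ rises from $0$ at $t = 0$ to $1$ at $t = 1$, which is the natural choice if $f(x, \theta)$ is to reduce to $f_k$ at $\theta = k$ and to $f_{k+1}$ at $\theta = k + 1$; the densities $f_k$ themselves are defined earlier in the paper and are not reproduced here.

```python
import numpy as np
from scipy.integrate import quad

def bump(t):
    # The C-infinity bump exp{-[1/t + 1/(1 - t)]} on (0, 1), zero outside.
    if 0.0 < t < 1.0:
        return np.exp(-(1.0 / t + 1.0 / (1.0 - t)))
    return 0.0

# Choose the multiple of the indefinite integral so that u(1) = 1.
TOTAL, _ = quad(bump, 0.0, 1.0)

def u(t):
    # Normalized indefinite integral of the bump: u = 0 for t <= 0, u = 1 for t >= 1,
    # and u is infinitely differentiable on the whole line.
    if t <= 0.0:
        return 0.0
    if t >= 1.0:
        return 1.0
    value, _ = quad(bump, 0.0, t)
    return value / TOTAL

# The weight rises smoothly from 0 to 1; by symmetry of the bump, u(1/2) = 1/2.
print([round(u(t), 4) for t in (0.0, 0.25, 0.5, 0.75, 1.0)])
```

Splicing this $u$ into $f(x, \theta) = [1 - u(\theta - k)]f_k(x) + u(\theta - k)f_{k+1}(x)$, with $k$ the integer part of $\theta$, then glues the $f_k$ into a family that is infinitely differentiable in $\theta$, as stated above.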
Let us return to the initial example with measures $p_k$, $k = 1, 2, \ldots$, and let us waste some information. Having observed $X_1, \ldots, X_n$ according to one of the $p_k$, take independent identically distributed $N(0, 10^6)$ variables $Y_1, \ldots, Y_n$ and consider $V_j = X_j + Y_j$ for $j = 1, 2, \ldots, n$.

Certainly one who observes $V_j$, $j = 1, \ldots, n$, instead of $X_i$, $i = 1, \ldots, n$, must be at a gross disadvantage! Maximum likelihood estimates do not really think so.

The densities of the new variables $V_j$ are functions, say $\psi_k$, defined, positive, analytic, etc. on the whole line $R = (-\infty, +\infty)$. They still are all different. In other words
$$
\int |\psi_k(x) - \psi_j(x)|\,dx > 0 \qquad (k \neq j).
$$
Compute the maximum likelihood estimate $\hat{\theta}_n = \hat{\theta}_n(V_1, \ldots, V_n)$ for these new observations. We claim that
$$
p_j[\hat{\theta}_n(V_1, \ldots, V_n) = j] \to 1 \quad \text{as } n \to \infty.
$$
To prove this let $\sigma = 10^3$ and note that $\psi_j(v)$ is a moderately small distortion of the function
$$
\tilde{\psi}_j(v) = c \int_0^1 \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(v - \xi)^2/(2\sigma^2)}\, d\xi + (1 - c)\, \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(v - a_j)^2/(2\sigma^2)}.
$$
Furthermore, as $m \to \infty$ the function $\tilde{\psi}_m(v)$ converges pointwise to
$$
\tilde{\psi}_\infty(v) = c \int_0^1 \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(v - \xi)^2/(2\sigma^2)}\, d\xi + (1 - c)\, \frac{1}{\sigma\sqrt{2\pi}}\, e^{-v^2/(2\sigma^2)}.
$$
Thus, we can compactify the set $\Theta = \{1, 2, \ldots\}$ by addition of a point at infinity, with $\tilde{\psi}_\infty(v)$ as described above.
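The two facts used here, that distinct members of the smoothed family stay a positive $L_1$ distance apart while $\tilde{\psi}_m$ converges pointwise to $\tilde{\psi}_\infty$, can be checked numerically. The sketch below takes the displayed formula for $\tilde{\psi}_j$ at face value and plugs in a hypothetical mixture weight $c = 1/2$ and a hypothetical rapidly decreasing sequence $a_k = 4^{-k}$; the actual $c$ and $a_k$ are fixed earlier in the paper and do not appear in this excerpt, so the numbers are purely illustrative.

```python
import numpy as np
from scipy.integrate import quad

C = 0.5          # hypothetical mixture weight c (not specified in this excerpt)
SIGMA = 1.0e3    # sigma = 10**3, matching the N(0, 10**6) noise in the text

def a(k):
    # Hypothetical stand-in for the rapidly decreasing sequence a_k -> 0.
    return 4.0 ** (-k)

def normal_density(v, mean):
    # Density of N(mean, SIGMA**2) at v.
    return np.exp(-(v - mean) ** 2 / (2.0 * SIGMA ** 2)) / (SIGMA * np.sqrt(2.0 * np.pi))

def psi_tilde(v, k=None):
    # Smoothed density from the displayed formula; k=None plays the role of the point at infinity.
    smoothed_uniform, _ = quad(lambda xi: normal_density(v, xi), 0.0, 1.0)
    center = 0.0 if k is None else a(k)
    return C * smoothed_uniform + (1.0 - C) * normal_density(v, center)

# Pointwise convergence: psi_tilde_m(v) approaches psi_tilde_infinity(v) as m grows.
for m in (1, 2, 4):
    print(m, abs(psi_tilde(0.0, m) - psi_tilde(0.0, None)))

# Distinct members stay apart: the L1 distance between psi_tilde_1 and psi_tilde_2 is positive.
grid = np.linspace(-5.0 * SIGMA, 5.0 * SIGMA, 2001)
dv = grid[1] - grid[0]
l1 = dv * sum(abs(psi_tilde(v, 1) - psi_tilde(v, 2)) for v in grid)
print("L1 distance between psi_tilde_1 and psi_tilde_2:", l1)
```

Under these stand-in values the printed pointwise gaps shrink with $m$ while the $L_1$ distance, though small because $\sigma$ is large, remains strictly positive, which is all the compactification argument needs.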