International Statistical Review (1990), 58, 2, pp. 153-171. Printed in Great Britain
© International Statistical Institute

Maximum Likelihood: An Introduction

L. Le Cam
Department of Statistics, University of California, Berkeley, California 94720, USA

Summary

Maximum likelihood estimates are reported to be best under all circumstances. Yet there are numerous simple examples where they plainly misbehave. One gives some examples for problems that had not been invented for the purpose of annoying maximum likelihood fans. Another example, imitated from Bahadur, has been specially created with just such a purpose in mind. Next, we present a list of principles leading to the construction of good estimates. The main principle says that one should not believe in principles but study each problem for its own sake.

Key words: Estimation; Maximum likelihood; One-step approximations.

1 Introduction

One of the most widely used methods of statistical estimation is that of maximum likelihood. Opinions on who was the first to propose the method differ. However, Fisher is usually credited with the invention of the name 'maximum likelihood', with a major effort intended to spread its use and with the derivation of the optimality properties of the resulting estimates.

Qualms about the general validity of the optimality properties have been expressed occasionally. However, as late as 1970 L.J. Savage could imply in his 'Fisher lecture' that the difficulties arising in some examples would have rightly been considered 'mathematical caviling' by R.A. Fisher.

Of course nobody has been able to prove that maximum likelihood estimates are 'best' under all circumstances. The lack of any such proof is not sufficient by itself to invalidate Fisher's claims. It might simply mean that we have not yet translated into mathematics the basic principles which underlay Fisher's intuition.

The present author has, unwittingly, contributed to the confusion by writing two papers which have been interpreted by some as attempts to substantiate Fisher's claims.

To clarify the situation we present a few known facts which should be kept in mind as one proceeds through the various proofs of consistency, asymptotic normality or asymptotic optimality of maximum likelihood estimates.

The examples given here deal mostly with the case of independent identically distributed observations. They are intended to show that maximum likelihood does possess disquieting features which rule out the possibility of existence of undiscovered underlying principles which could be used to justify it. One of the very gross forms of misbehavior can be stated as follows.

Maximum likelihood estimates computed with all the information available may turn out to be inconsistent. Throwing away a substantial part of the information may render them consistent.
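A classical instance of this phenomenon, recalled here only for orientation and not as part of the enumeration to follow, is the Neyman-Scott problem (Neyman & Scott, 1948). Let $X_{ij}$, $i = 1, \dots, n$, $j = 1, 2$, be independent with $X_{ij} \sim N(\xi_i, \sigma^2)$, each pair carrying its own nuisance mean $\xi_i$. The maximum likelihood estimate of $\sigma^2$, computed from all the observations, satisfies
$$\hat{\sigma}^2 = \frac{1}{2n} \sum_{i=1}^{n} \sum_{j=1}^{2} \left( X_{ij} - \bar{X}_i \right)^2 = \frac{1}{4n} \sum_{i=1}^{n} \left( X_{i1} - X_{i2} \right)^2 \;\longrightarrow\; \frac{\sigma^2}{2}$$
in probability, and is therefore inconsistent. Throwing away the pair means and retaining only the differences $D_i = X_{i1} - X_{i2} \sim N(0, 2\sigma^2)$, which carry no information about the $\xi_i$, yields the consistent estimate $\tilde{\sigma}^2 = (2n)^{-1} \sum_{i} D_i^2$.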
The examples show that, in spite of all its presumed virtues, the maximum likelihood procedure cannot be universally recommended. This does not mean that we advocate