A FIRST COURSE IN PROBABILITY
Eighth Edition

Sheldon Ross
University of Southern California

Prentice Hall is an imprint of Pearson
Upper Saddle River, New Jersey 07458
Contents

Preface xi

1 Combinatorial Analysis 1
1.1 Introduction . . . . . . . . . . 1
1.2 The Basic Principle of Counting . . . . . . . . . . 1
1.3 Permutations . . . . . . . . . . 3
1.4 Combinations . . . . . . . . . . 5
1.5 Multinomial Coefficients . . . . . . . . . . 9
1.6 The Number of Integer Solutions of Equations . . . . . . . . . . 12
Summary . . . . . . . . . . 15
Problems . . . . . . . . . . 16
Theoretical Exercises . . . . . . . . . . 18
Self-Test Problems and Exercises . . . . . . . . . . 20

2 Axioms of Probability 22
2.1 Introduction . . . . . . . . . . 22
2.2 Sample Space and Events . . . . . . . . . . 22
2.3 Axioms of Probability . . . . . . . . . . 26
2.4 Some Simple Propositions . . . . . . . . . . 29
2.5 Sample Spaces Having Equally Likely Outcomes . . . . . . . . . . 33
2.6 Probability as a Continuous Set Function . . . . . . . . . . 44
2.7 Probability as a Measure of Belief . . . . . . . . . . 48
Summary . . . . . . . . . . 49
Problems . . . . . . . . . . 50
Theoretical Exercises . . . . . . . . . . 54
Self-Test Problems and Exercises . . . . . . . . . . 56

3 Conditional Probability and Independence 58
3.1 Introduction . . . . . . . . . . 58
3.2 Conditional Probabilities . . . . . . . . . . 58
3.3 Bayes’s Formula . . . . . . . . . . 65
3.4 Independent Events . . . . . . . . . . 79
3.5 P(·|F) Is a Probability . . . . . . . . . . 93
Summary . . . . . . . . . . 101
Problems . . . . . . . . . . 102
Theoretical Exercises . . . . . . . . . . 110
Self-Test Problems and Exercises . . . . . . . . . . 114

4 Random Variables 117
4.1 Random Variables . . . . . . . . . . 117
4.2 Discrete Random Variables . . . . . . . . . . 123
4.3 Expected Value . . . . . . . . . . 125
4.4 Expectation of a Function of a Random Variable . . . . . . . . . . 128
4.5 Variance . . . . . . . . . . 132
4.6 The Bernoulli and Binomial Random Variables . . . . . . . . . . 134
4.6.1 Properties of Binomial Random Variables . . . . . . . . . . 139
4.6.2 Computing the Binomial Distribution Function . . . . . . . . . . 142
4.7 The Poisson Random Variable . . . . . . . . . . 143
4.7.1 Computing the Poisson Distribution Function . . . . . . . . . . 154
4.8 Other Discrete Probability Distributions . . . . . . . . . . 155
4.8.1 The Geometric Random Variable . . . . . . . . . . 155
4.8.2 The Negative Binomial Random Variable . . . . . . . . . . 157
4.8.3 The Hypergeometric Random Variable . . . . . . . . . . 160
4.8.4 The Zeta (or Zipf) Distribution . . . . . . . . . . 163
4.9 Expected Value of Sums of Random Variables . . . . . . . . . . 164
4.10 Properties of the Cumulative Distribution Function . . . . . . . . . . 168
Summary . . . . . . . . . . 170
Problems . . . . . . . . . . 172
Theoretical Exercises . . . . . . . . . . 179
Self-Test Problems and Exercises . . . . . . . . . . 183

5 Continuous Random Variables 186
5.1 Introduction . . . . . . . . . . 186
5.2 Expectation and Variance of Continuous Random Variables . . . . . . . . . . 190
5.3 The Uniform Random Variable . . . . . . . . . . 194
5.4 Normal Random Variables . . . . . . . . . . 198
5.4.1 The Normal Approximation to the Binomial Distribution . . . . . . . . . . 204
5.5 Exponential Random Variables . . . . . . . . . . 208
5.5.1 Hazard Rate Functions . . . . . . . . . . 212
5.6 Other Continuous Distributions . . . . . . . . . . 215
5.6.1 The Gamma Distribution . . . . . . . . . . 215
5.6.2 The Weibull Distribution . . . . . . . . . . 216
5.6.3 The Cauchy Distribution . . . . . . . . . . 217
5.6.4 The Beta Distribution . . . . . . . . . . 218
5.7 The Distribution of a Function of a Random Variable . . . . . . . . . . 219
Summary . . . . . . . . . . 222
Problems . . . . . . . . . . 224
Theoretical Exercises . . . . . . . . . . 227
Self-Test Problems and Exercises . . . . . . . . . . 229

6 Jointly Distributed Random Variables 232
6.1 Joint Distribution Functions . . . . . . . . . . 232
6.2 Independent Random Variables . . . . . . . . . . 240
6.3 Sums of Independent Random Variables . . . . . . . . . . 252
6.3.1 Identically Distributed Uniform Random Variables . . . . . . . . . . 252
6.3.2 Gamma Random Variables . . . . . . . . . . 254
6.3.3 Normal Random Variables . . . . . . . . . . 256
6.3.4 Poisson and Binomial Random Variables . . . . . . . . . . 259
6.3.5 Geometric Random Variables . . . . . . . . . . 260
6.4 Conditional Distributions: Discrete Case . . . . . . . . . . 263
6.5 Conditional Distributions: Continuous Case . . . . . . . . . . 266
6.6 Order Statistics . . . . . . . . . . 270
6.7 Joint Probability Distribution of Functions of Random Variables . . . . . . . . . . 274
6.8 Exchangeable Random Variables . . . . . . . . . . 282
Summary . . . . . . . . . . 285
Problems . . . . . . . . . . 287
Theoretical Exercises . . . . . . . . . . 291
Self-Test Problems and Exercises . . . . . . . . . . 293
7 Properties of Expectation 297
7.1 Introduction . . . . . . . . . . 297
7.2 Expectation of Sums of Random Variables . . . . . . . . . . 298
7.2.1 Obtaining Bounds from Expectations via the Probabilistic Method . . . . . . . . . . 311
7.2.2 The Maximum–Minimums Identity . . . . . . . . . . 313
7.3 Moments of the Number of Events that Occur . . . . . . . . . . 315
7.4 Covariance, Variance of Sums, and Correlations . . . . . . . . . . 322
7.5 Conditional Expectation . . . . . . . . . . 331
7.5.1 Definitions . . . . . . . . . . 331
7.5.2 Computing Expectations by Conditioning . . . . . . . . . . 333
7.5.3 Computing Probabilities by Conditioning . . . . . . . . . . 344
7.5.4 Conditional Variance . . . . . . . . . . 347
7.6 Conditional Expectation and Prediction . . . . . . . . . . 349
7.7 Moment Generating Functions . . . . . . . . . . 354
7.7.1 Joint Moment Generating Functions . . . . . . . . . . 363
7.8 Additional Properties of Normal Random Variables . . . . . . . . . . 365
7.8.1 The Multivariate Normal Distribution . . . . . . . . . . 365
7.8.2 The Joint Distribution of the Sample Mean and Sample Variance . . . . . . . . . . 367
7.9 General Definition of Expectation . . . . . . . . . . 369
Summary . . . . . . . . . . 370
Problems . . . . . . . . . . 373
Theoretical Exercises . . . . . . . . . . 380
Self-Test Problems and Exercises . . . . . . . . . . 384

8 Limit Theorems 388
8.1 Introduction . . . . . . . . . . 388
8.2 Chebyshev’s Inequality and the Weak Law of Large Numbers . . . . . . . . . . 388
8.3 The Central Limit Theorem . . . . . . . . . . 391
8.4 The Strong Law of Large Numbers . . . . . . . . . . 400
8.5 Other Inequalities . . . . . . . . . . 403
8.6 Bounding the Error Probability When Approximating a Sum of Independent Bernoulli Random Variables by a Poisson Random Variable . . . . . . . . . . 410
Summary . . . . . . . . . . 412
Problems . . . . . . . . . . 412
Theoretical Exercises . . . . . . . . . . 414
Self-Test Problems and Exercises . . . . . . . . . . 415

9 Additional Topics in Probability 417
9.1 The Poisson Process . . . . . . . . . . 417
9.2 Markov Chains . . . . . . . . . . 419
9.3 Surprise, Uncertainty, and Entropy . . . . . . . . . . 425
9.4 Coding Theory and Entropy . . . . . . . . . . 428
Summary . . . . . . . . . . 434
Problems and Theoretical Exercises . . . . . . . . . . 435
Self-Test Problems and Exercises . . . . . . . . . . 436
References . . . . . . . . . . 436
10 Simulation 438
10.1 Introduction . . . . . . . . . . 438
10.2 General Techniques for Simulating Continuous Random Variables . . . . . . . . . . 440
10.2.1 The Inverse Transformation Method . . . . . . . . . . 441
10.2.2 The Rejection Method . . . . . . . . . . 442
10.3 Simulating from Discrete Distributions . . . . . . . . . . 447
10.4 Variance Reduction Techniques . . . . . . . . . . 449
10.4.1 Use of Antithetic Variables . . . . . . . . . . 450
10.4.2 Variance Reduction by Conditioning . . . . . . . . . . 451
10.4.3 Control Variates . . . . . . . . . . 452
Summary . . . . . . . . . . 453
Problems . . . . . . . . . . 453
Self-Test Problems and Exercises . . . . . . . . . . 455
Reference . . . . . . . . . . 455

Answers to Selected Problems 457
Solutions to Self-Test Problems and Exercises 461
Index 521
Preface "We see that the theory of probability is at bottom only common sense reduced to calculation;it makes us appreciate with exactitude what reasonable minds feel by a sort of instinct,often without being able to account for it....It is remarkable that this science,which originated in the consideration of games of chance,should have become the most important object of human knowledge.. I he most importan So said tons of e are.tor the most part,really only problems of probabilty. n and astronomer (the wton of France")Pierre rquis de aplace.A who was alsc hough many people eel tha ne la of the great contr hty,migh what,it is n that proba strialists.In fact. ask not "Is it so2”but rather“What is the obability that it is so? This hook is intended a introduction to the the of for students in mathematics statistics e ering and the sciences (including c puter science,biology,the social sciences,and management science)who possess the prerequisite knowledge of elementary calculus.It attempts to present not only the mathematics of probability theory,but also,through numerous examples,the many diverse possible applications of this subject. Chapter 1 presents the basic principles of combinatorial analysis,which are most useful in com nputing probabilities. Chapter 2 handles the axioms of probability theory and shows how they can be applied to compute various probabilities of interest. Chapter 3 deals with the extremely important subjects of conditional probability pendence of events.By a se ries of examples.we il ustrate how conditional ial informati as a too more casily,even bta s by "co reappears in Chapter where we use it to obt expectations crete Chapter 5.and iointly distributed random variables in Ch er 6.Thei Chapters 4 and 5,and these quantities are then determined for many of the common types of random variables. Additional properties of the expected value are considered in Chapter 7.Many examples illustrating the usefulness of the result that the expected value of a sum of random variables is equal to the sum of their expected values are presented.Sections on conditional expectation,including its use in prediction,and on moment-generating functions are contained in this chapter.In addition,the final section introduces the multivariate normal distribution and presents a simple proof concerning the joint distribution of the sample mean and sample variance of a sample from a normal distribution
Preface

“We see that the theory of probability is at bottom only common sense reduced to calculation; it makes us appreciate with exactitude what reasonable minds feel by a sort of instinct, often without being able to account for it. ... It is remarkable that this science, which originated in the consideration of games of chance, should have become the most important object of human knowledge. ... The most important questions of life are, for the most part, really only problems of probability.”

So said the famous French mathematician and astronomer (the “Newton of France”) Pierre-Simon, Marquis de Laplace. Although many people feel that the famous marquis, who was also one of the great contributors to the development of probability, might have exaggerated somewhat, it is nevertheless true that probability theory has become a tool of fundamental importance to nearly all scientists, engineers, medical practitioners, jurists, and industrialists. In fact, the enlightened individual has learned to ask not “Is it so?” but rather “What is the probability that it is so?”

This book is intended as an elementary introduction to the theory of probability for students in mathematics, statistics, engineering, and the sciences (including computer science, biology, the social sciences, and management science) who possess the prerequisite knowledge of elementary calculus. It attempts to present not only the mathematics of probability theory, but also, through numerous examples, the many diverse possible applications of this subject.

Chapter 1 presents the basic principles of combinatorial analysis, which are most useful in computing probabilities.

Chapter 2 handles the axioms of probability theory and shows how they can be applied to compute various probabilities of interest.

Chapter 3 deals with the extremely important subjects of conditional probability and independence of events. By a series of examples, we illustrate how conditional probabilities come into play not only when some partial information is available, but also as a tool to enable us to compute probabilities more easily, even when no partial information is present. This extremely important technique of obtaining probabilities by “conditioning” reappears in Chapter 7, where we use it to obtain expectations.

The concept of random variables is introduced in Chapters 4, 5, and 6. Discrete random variables are dealt with in Chapter 4, continuous random variables in Chapter 5, and jointly distributed random variables in Chapter 6. The important concepts of the expected value and the variance of a random variable are introduced in Chapters 4 and 5, and these quantities are then determined for many of the common types of random variables.

Additional properties of the expected value are considered in Chapter 7. Many examples illustrating the usefulness of the result that the expected value of a sum of random variables is equal to the sum of their expected values are presented. Sections on conditional expectation, including its use in prediction, and on moment-generating functions are contained in this chapter. In addition, the final section introduces the multivariate normal distribution and presents a simple proof concerning the joint distribution of the sample mean and sample variance of a sample from a normal distribution.
Chapter 8 presents the major theoretical results of probability theory. In particular, we prove the strong law of large numbers and the central limit theorem. Our proof of the strong law is a relatively simple one which assumes that the random variables have a finite fourth moment, and our proof of the central limit theorem assumes Lévy’s continuity theorem. This chapter also presents such probability inequalities as Markov’s inequality, Chebyshev’s inequality, and Chernoff bounds. The final section of Chapter 8 gives a bound on the error involved when a probability concerning a sum of independent Bernoulli random variables is approximated by the corresponding probability of a Poisson random variable having the same expected value.

Chapter 9 presents some additional topics, such as Markov chains, the Poisson process, and an introduction to information and coding theory, and Chapter 10 considers simulation.

As in the previous edition, three sets of exercises are given at the end of each chapter. They are designated as Problems, Theoretical Exercises, and Self-Test Problems and Exercises. This last set of exercises, for which complete solutions appear in Solutions to Self-Test Problems and Exercises, is designed to help students test their comprehension and study for exams.

CHANGES IN THE NEW EDITION

The eighth edition continues the evolution and fine-tuning of the text. It includes new problems, exercises, and text material chosen both for its inherent interest and for its use in building student intuition about probability. Illustrative of these goals are Example 5d of Chapter 1 on knockout tournaments, and Examples 4k and 5i of Chapter 7 on multiple-player gambler’s ruin problems.

A key change in the current edition is that the important result that the expectation of a sum of random variables is equal to the sum of the expectations is now first presented in Chapter 4 (rather than in Chapter 7 as in previous editions). A new and elementary proof of this result when the sample space of the probability experiment is finite is given in this chapter.

Another change is the expansion of Section 6.3, which deals with the sum of independent random variables. Section 6.3.1 is a new section in which we derive the distribution of the sum of independent and identically distributed uniform random variables, and then use our results to show that the expected number of random numbers that need to be added for their sum to exceed 1 is equal to e. Section 6.3.5 is a new section in which we derive the distribution of the sum of independent geometric random variables with different means.

ACKNOWLEDGMENTS

I am grateful for the careful work of Hossein Hamedani in checking the text for accuracy. I also appreciate the thoughtfulness of the following people who have taken the time to contact me with comments for improving the text: Amir Ardestani, Polytechnic University of Teheran; Joe Blitzstein, Harvard University; Peter Nuesch, University of Lausanne; Joseph Mitchell, SUNY, Stony Brook; Alan Chambless, actuary; Robert Kriner; Israel David, Ben-Gurion University; T. Lim, George Mason University; Wei Chen, Rutgers; D. Monrad, University of Illinois; W. Rosenberger, George Mason University; E. Ionides, University of Michigan; J. Corvino, Lafayette College; T. Seppalainen, University of Wisconsin.

Finally, I would like to thank the following reviewers for their many helpful comments. Reviewers of the eighth edition are marked with an asterisk.
K. B. Athreya, Iowa State University
Richard Bass, University of Connecticut
Robert Bauer, University of Illinois at Urbana-Champaign
Phillip Beckwith, Michigan Tech
Arthur Benjamin, Harvey Mudd College
Geoffrey Berresford, Long Island University
Baidurya Bhattacharya, University of Delaware
Howard Bird, St. Cloud State University
Shahar Boneh, Metropolitan State College of Denver
Jean Cadet, State University of New York at Stony Brook
Steven Chiappari, Santa Clara University
Nicolas Christou, University of California, Los Angeles
James Clay, University of Arizona at Tucson
Francis Conlan, University of Santa Clara
*Justin Corvino, Lafayette College
Jay DeVore, California Polytechnic University, San Luis Obispo
Scott Emerson, University of Washington
Thomas R. Fischer, Texas A & M University
Anant Godbole, Michigan Technical University
Zakkula Govindarajulu, University of Kentucky
Richard Groeneveld, Iowa State University
Mike Hardy, Massachusetts Institute of Technology
Bernard Harris, University of Wisconsin
Larry Harris, University of Kentucky
David Heath, Cornell University
Stephen Herschkorn, Rutgers University
Julia L. Higle, University of Arizona
Mark Huber, Duke University
*Edward Ionides, University of Michigan
Anastasia Ivanova, University of North Carolina
Hamid Jafarkhani, University of California, Irvine
Chuanshu Ji, University of North Carolina, Chapel Hill
Robert Keener, University of Michigan
Fred Leysieffer, Florida State University
Thomas Liggett, University of California, Los Angeles
Helmut Mayer, University of Georgia
Bill McCormick, University of Georgia
Ian McKeague, Florida State University
R. Miller, Stanford University
*Ditlev Monrad, University of Illinois
Robb J. Muirhead, University of Michigan
Joe Naus, Rutgers University
Nhu Nguyen, New Mexico State University
Ellen O’Brien, George Mason University
N. U. Prabhu, Cornell University
Kathryn Prewitt, Arizona State University
Jim Propp, University of Wisconsin
*William F. Rosenberger, George Mason University
Myra Samuels, Purdue University
I. R. Savage, Yale University
Art Schwartz, University of Michigan at Ann Arbor
Therese Shelton, Southwestern University
Malcolm Sherman, State University of New York at Albany
Murad Taqqu, Boston University
Eli Upfal, Brown University
Ed Wheeler, University of Tennessee
Allen Webster, Bradley University

S. R.
smross@usc.edu
CHAPTER 1

Combinatorial Analysis

1.1 INTRODUCTION
1.2 THE BASIC PRINCIPLE OF COUNTING
1.3 PERMUTATIONS
1.4 COMBINATIONS
1.5 MULTINOMIAL COEFFICIENTS
1.6 THE NUMBER OF INTEGER SOLUTIONS OF EQUATIONS

1.1 INTRODUCTION

Here is a typical problem of interest involving probability: A communication system is to consist of n seemingly identical antennas that are to be lined up in a linear order. The resulting system will then be able to receive all incoming signals—and will be called functional—as long as no two consecutive antennas are defective. If it turns out that exactly m of the n antennas are defective, what is the probability that the resulting system will be functional? For instance, in the special case where n = 4 and m = 2, there are 6 possible system configurations, namely,

0110  0101  1010  0011  1001  1100

where 1 means that the antenna is working and 0 that it is defective. Because the resulting system will be functional in the first 3 arrangements and not functional in the remaining 3, it seems reasonable to take 3/6 = 1/2 as the desired probability. In the case of general n and m, we could compute the probability that the system is functional in a similar fashion. That is, we could count the number of configurations that result in the system's being functional and then divide by the total number of all possible configurations.

From the preceding discussion, we see that it would be useful to have an effective method for counting the number of ways that things can occur. In fact, many problems in probability theory can be solved simply by counting the number of different ways that a certain event can occur. The mathematical theory of counting is formally known as combinatorial analysis.

1.2 THE BASIC PRINCIPLE OF COUNTING

The basic principle of counting will be fundamental to all our work. Loosely put, it states that if one experiment can result in any of m possible outcomes and if another experiment can result in any of n possible outcomes, then there are mn possible outcomes of the two experiments.
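The count-and-divide procedure described in Section 1.1 is easy to mechanize, which gives a handy check on hand counts. The following Python sketch is our illustration, not part of the text; the helper name functional_probability is ours. It enumerates every choice of which m of the n positions hold the defective antennas, keeps the choices in which no two defectives are adjacent, and divides by the total number of choices.

    from itertools import combinations

    def functional_probability(n, m):
        """Return (functional_count, total_count, probability) for a row of n
        antennas of which exactly m are defective, assuming every placement
        of the m defectives is equally likely."""
        # All C(n, m) ways to choose the defective positions.
        placements = list(combinations(range(n), m))

        def is_functional(defective):
            s = set(defective)
            # Functional means no two consecutive positions are both defective.
            return all(not (i in s and i + 1 in s) for i in range(n - 1))

        functional = sum(is_functional(p) for p in placements)
        return functional, len(placements), functional / len(placements)

    # The special case worked in the text: n = 4, m = 2.
    print(functional_probability(4, 2))  # prints (3, 6, 0.5), i.e., 3/6 = 1/2

For n = 4 and m = 2 this reproduces the 3 functional arrangements out of 6 listed above; for general n and m it carries out exactly the count-and-divide computation just described, though brute-force enumeration naturally becomes impractical as n grows.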