A FIRST COURSE IN PROBABILITY

Eighth Edition

Sheldon Ross
University of Southern California

Pearson
Upper Saddle River, New Jersey 07458
Contents

Preface xi

1 Combinatorial Analysis 1
1.1 Introduction 1
1.2 The Basic Principle of Counting 1
1.3 Permutations 3
1.4 Combinations 5
1.5 Multinomial Coefficients 9
1.6 The Number of Integer Solutions of Equations 12
Summary 15
Problems 16
Theoretical Exercises 18
Self-Test Problems and Exercises 20

2 Axioms of Probability 22
2.1 Introduction 22
2.2 Sample Space and Events 22
2.3 Axioms of Probability 26
2.4 Some Simple Propositions 29
2.5 Sample Spaces Having Equally Likely Outcomes 33
2.6 Probability as a Continuous Set Function 44
2.7 Probability as a Measure of Belief 48
Summary 49
Problems 50
Theoretical Exercises 54
Self-Test Problems and Exercises 56

3 Conditional Probability and Independence 58
3.1 Introduction 58
3.2 Conditional Probabilities 58
3.3 Bayes's Formula 65
3.4 Independent Events 79
3.5 P(·|F) Is a Probability 93
Summary 101
Problems 102
Theoretical Exercises 110
Self-Test Problems and Exercises 114

4 Random Variables 117
4.1 Random Variables 117
4.2 Discrete Random Variables 123
4.3 Expected Value 125
4.4 Expectation of a Function of a Random Variable 128
4.5 Variance 132
4.6 The Bernoulli and Binomial Random Variables 134
4.6.1 Properties of Binomial Random Variables 139
4.6.2 Computing the Binomial Distribution Function 142
4.7 The Poisson Random Variable 143
4.7.1 Computing the Poisson Distribution Function 154
4.8 Other Discrete Probability Distributions 155
4.8.1 The Geometric Random Variable 155
4.8.2 The Negative Binomial Random Variable 157
4.8.3 The Hypergeometric Random Variable 160
4.8.4 The Zeta (or Zipf) Distribution 163
4.9 Expected Value of Sums of Random Variables 164
4.10 Properties of the Cumulative Distribution Function 168
Summary 170
Problems 172
Theoretical Exercises 179
Self-Test Problems and Exercises 183

5 Continuous Random Variables 186
5.1 Introduction 186
5.2 Expectation and Variance of Continuous Random Variables 190
5.3 The Uniform Random Variable 194
5.4 Normal Random Variables 198
5.4.1 The Normal Approximation to the Binomial Distribution 204
5.5 Exponential Random Variables 208
5.5.1 Hazard Rate Functions 212
5.6 Other Continuous Distributions 215
5.6.1 The Gamma Distribution 215
5.6.2 The Weibull Distribution 216
5.6.3 The Cauchy Distribution 217
5.6.4 The Beta Distribution 218
5.7 The Distribution of a Function of a Random Variable 219
Summary 222
Problems 224
Theoretical Exercises 227
Self-Test Problems and Exercises 229

6 Jointly Distributed Random Variables 232
6.1 Joint Distribution Functions 232
6.2 Independent Random Variables 240
6.3 Sums of Independent Random Variables 252
6.3.1 Identically Distributed Uniform Random Variables 252
6.3.2 Gamma Random Variables 254
6.3.3 Normal Random Variables 256
6.3.4 Poisson and Binomial Random Variables 259
6.3.5 Geometric Random Variables 260
6.4 Conditional Distributions: Discrete Case 263
6.5 Conditional Distributions: Continuous Case 266
6.6 Order Statistics 270
6.7 Joint Probability Distribution of Functions of Random Variables 274
6.8 Exchangeable Random Variables 282
Summary 285
Problems 287
Theoretical Exercises 291
Self-Test Problems and Exercises 293
7 Properties of Expectation 297
7.1 Introduction 297
7.2 Expectation of Sums of Random Variables 298
7.2.1 Obtaining Bounds from Expectations via the Probabilistic Method 311
7.2.2 The Maximum-Minimums Identity 313
7.3 Moments of the Number of Events that Occur 315
7.4 Covariance, Variance of Sums, and Correlations 322
7.5 Conditional Expectation 331
7.5.1 Definitions 331
7.5.2 Computing Expectations by Conditioning 333
7.5.3 Computing Probabilities by Conditioning 344
7.5.4 Conditional Variance 347
7.6 Conditional Expectation and Prediction 349
7.7 Moment Generating Functions 354
7.7.1 Joint Moment Generating Functions 363
7.8 Additional Properties of Normal Random Variables 365
7.8.1 The Multivariate Normal Distribution 365
7.8.2 The Joint Distribution of the Sample Mean and Sample Variance 367
7.9 General Definition of Expectation 369
Summary 370
Problems 373
Theoretical Exercises 380
Self-Test Problems and Exercises 384

8 Limit Theorems 388
8.1 Introduction 388
8.2 Chebyshev's Inequality and the Weak Law of Large Numbers 388
8.3 The Central Limit Theorem 391
8.4 The Strong Law of Large Numbers 400
8.5 Other Inequalities 403
8.6 Bounding the Error Probability When Approximating a Sum of Independent Bernoulli Random Variables by a Poisson Random Variable 410
Summary 412
Problems 412
Theoretical Exercises 414
Self-Test Problems and Exercises 415

9 Additional Topics in Probability 417
9.1 The Poisson Process 417
9.2 Markov Chains 419
9.3 Surprise, Uncertainty, and Entropy 425
9.4 Coding Theory and Entropy 428
Summary 434
Problems and Theoretical Exercises 435
Self-Test Problems and Exercises 436
References 436
10 Simulation 438
10.1 Introduction 438
10.2 General Techniques for Simulating Continuous Random Variables 440
10.2.1 The Inverse Transformation Method 441
10.2.2 The Rejection Method 442
10.3 Simulating from Discrete Distributions 447
10.4 Variance Reduction Techniques 449
10.4.1 Use of Antithetic Variables 450
10.4.2 Variance Reduction by Conditioning 451
10.4.3 Control Variates 452
Summary 453
Problems 453
Self-Test Problems and Exercises 455
Reference 455

Answers to Selected Problems 457

Solutions to Self-Test Problems and Exercises 461

Index 521
Preface

“We see that the theory of probability is at bottom only common sense reduced to calculation; it makes us appreciate with exactitude what reasonable minds feel by a sort of instinct, often without being able to account for it. ... It is remarkable that this science, which originated in the consideration of games of chance, should have become the most important object of human knowledge. ... The most important questions of life are, for the most part, really only problems of probability.”

So said the famous French mathematician and astronomer (the “Newton of France”) Pierre-Simon, Marquis de Laplace. Although many people feel that the famous marquis, who was also one of the great contributors to the development of probability, might have exaggerated somewhat, it is nevertheless true that probability theory has become a tool of fundamental importance to nearly all scientists, engineers, medical practitioners, jurists, and industrialists. In fact, the enlightened individual had learned to ask not “Is it so?” but rather “What is the probability that it is so?”

This book is intended as an elementary introduction to the theory of probability for students in mathematics, statistics, engineering, and the sciences (including computer science, biology, the social sciences, and management science) who possess the prerequisite knowledge of elementary calculus. It attempts to present not only the mathematics of probability theory, but also, through numerous examples, the many diverse possible applications of this subject.

Chapter 1 presents the basic principles of combinatorial analysis, which are most useful in computing probabilities.

Chapter 2 handles the axioms of probability theory and shows how they can be applied to compute various probabilities of interest.

Chapter 3 deals with the extremely important subjects of conditional probability and independence of events. By a series of examples, we illustrate how conditional probabilities come into play not only when some partial information is available, but also as a tool to enable us to compute probabilities more easily, even when no partial information is present. This extremely important technique of obtaining probabilities by “conditioning” reappears in Chapter 7, where we use it to obtain expectations.

The concept of random variables is introduced in Chapters 4, 5, and 6. Discrete random variables are dealt with in Chapter 4, continuous random variables in Chapter 5, and jointly distributed random variables in Chapter 6. The important concepts of the expected value and the variance of a random variable are introduced in Chapters 4 and 5, and these quantities are then determined for many of the common types of random variables.

Additional properties of the expected value are considered in Chapter 7. Many examples illustrating the usefulness of the result that the expected value of a sum of random variables is equal to the sum of their expected values are presented. Sections on conditional expectation, including its use in prediction, and on moment-generating functions are contained in this chapter. In addition, the final section introduces the multivariate normal distribution and presents a simple proof concerning the joint distribution of the sample mean and sample variance of a sample from a normal distribution.
Chapter 8 presents the major theoretical results of probability theory. In particular, we prove the strong law of large numbers and the central limit theorem. Our proof of the strong law is a relatively simple one which assumes that the random variables have a finite fourth moment, and our proof of the central limit theorem assumes Lévy's continuity theorem. This chapter also presents such probability inequalities as Markov's inequality, Chebyshev's inequality, and Chernoff bounds. The final section of Chapter 8 gives a bound on the error involved when a probability concerning a sum of independent Bernoulli random variables is approximated by the corresponding probability of a Poisson random variable having the same expected value.

Chapter 9 presents some additional topics, such as Markov chains, the Poisson process, and an introduction to information and coding theory, and Chapter 10 considers simulation.

As in the previous edition, three sets of exercises are given at the end of each chapter. They are designated as Problems, Theoretical Exercises, and Self-Test Problems and Exercises. This last set of exercises, for which complete solutions appear in Solutions to Self-Test Problems and Exercises, is designed to help students test their comprehension and study for exams.

CHANGES IN THE NEW EDITION

The eighth edition continues the evolution and fine tuning of the text. It includes new problems, exercises, and text material chosen both for its inherent interest and for its use in building student intuition about probability. Illustrative of these goals are Example 5d of Chapter 1 on knockout tournaments, and Examples 4k and 5i of Chapter 7 on multiple player gambler's ruin problems.

A key change in the current edition is that the important result that the expectation of a sum of random variables is equal to the sum of the expectations is now first presented in Chapter 4 (rather than Chapter 7 as in previous editions). A new and elementary proof of this result when the sample space of the probability experiment is finite is given in this chapter.

Another change is the expansion of Section 6.3, which deals with the sum of independent random variables. Section 6.3.1 is a new section in which we derive the distribution of the sum of independent and identically distributed uniform random variables, and then use our results to show that the expected number of random numbers that needs to be added for their sum to exceed 1 is equal to e. Section 6.3.5 is a new section in which we derive the distribution of the sum of independent geometric random variables with different means.

ACKNOWLEDGMENTS

I am grateful for the careful work of Hossein Hamedani in checking the text for accuracy. I also appreciate the thoughtfulness of the following people that have taken the time to contact me with comments for improving the text: Amir Ardestani, Polytechnic University of Teheran; Joe Blitzstein, Harvard University; Peter Nuesch, University of Lausanne; Joseph Mitchell, SUNY, Stony Brook; Alan Chambless, actuary; Robert Kriner; Israel David, Ben-Gurion University; T. Lim, George Mason University; Wei Chen, Rutgers; D. Monrad, University of Illinois; W. Rosenberger, George Mason University; E. Ionides, University of Michigan; J. Corvino, Lafayette College; T. Seppalainen, University of Wisconsin. Finally, I would like to thank the following reviewers for their many helpful comments. Reviewers of the eighth edition are marked with an asterisk.
K. B. Athreya, Iowa State University
Richard Bass, University of Connecticut
Robert Bauer, University of Illinois at Urbana-Champaign
Phillip Beckwith, Michigan Tech
Arthur Benjamin, Harvey Mudd College
Geoffrey Berresford, Long Island University
Baidurya Bhattacharya, University of Delaware
Howard Bird, St. Cloud State University
Shahar Boneh, Metropolitan State College of Denver
Jean Cadet, State University of New York at Stony Brook
Steven Chiappari, Santa Clara University
Nicolas Christou, University of California, Los Angeles
James Clay, University of Arizona at Tucson
Francis Conlan, University of Santa Clara
*Justin Corvino, Lafayette College
Jay DeVore, California Polytechnic University, San Luis Obispo
Scott Emerson, University of Washington
Thomas R. Fischer, Texas A & M University
Anant Godbole, Michigan Technical University
Zakkula Govindarajulu, University of Kentucky
Richard Groeneveld, Iowa State University
Mike Hardy, Massachusetts Institute of Technology
Bernard Harris, University of Wisconsin
Larry Harris, University of Kentucky
David Heath, Cornell University
Stephen Herschkorn, Rutgers University
Julia L. Higle, University of Arizona
Mark Huber, Duke University
*Edward Ionides, University of Michigan
Anastasia Ivanova, University of North Carolina
Hamid Jafarkhani, University of California, Irvine
Chuanshu Ji, University of North Carolina, Chapel Hill
Robert Keener, University of Michigan
Fred Leysieffer, Florida State University
Thomas Liggett, University of California, Los Angeles
Helmut Mayer, University of Georgia
Bill McCormick, University of Georgia
Ian McKeague, Florida State University
R. Miller, Stanford University
*Ditlev Monrad, University of Illinois
Robb J. Muirhead, University of Michigan
Joe Naus, Rutgers University
Nhu Nguyen, New Mexico State University
Ellen O’Brien, George Mason University
N. U. Prabhu, Cornell University
Kathryn Prewitt, Arizona State University
Jim Propp, University of Wisconsin
*William F. Rosenberger, George Mason University
Myra Samuels, Purdue University
I. R. Savage, Yale University
Art Schwartz, University of Michigan at Ann Arbor
Therese Shelton, Southwestern University
Malcolm Sherman, State University of New York at Albany
Murad Taqqu, Boston University
Eli Upfal, Brown University
Ed Wheeler, University of Tennessee
Allen Webster, Bradley University

S. R.
smross@usc.edu
CHAPTER 1

Combinatorial Analysis

1.1 INTRODUCTION
1.2 THE BASIC PRINCIPLE OF COUNTING
1.3 PERMUTATIONS
1.4 COMBINATIONS
1.5 MULTINOMIAL COEFFICIENTS
1.6 THE NUMBER OF INTEGER SOLUTIONS OF EQUATIONS

1.1 INTRODUCTION

Here is a typical problem of interest involving probability: A communication system is to consist of n seemingly identical antennas that are to be lined up in a linear order. The resulting system will then be able to receive all incoming signals—and will be called functional—as long as no two consecutive antennas are defective. If it turns out that exactly m of the n antennas are defective, what is the probability that the resulting system will be functional? For instance, in the special case where n = 4 and m = 2, there are 6 possible system configurations, namely,

0110   0101   1010   0011   1001   1100

where 1 means that the antenna is working and 0 that it is defective. Because the resulting system will be functional in the first 3 arrangements and not functional in the remaining 3, it seems reasonable to take 3/6 = 1/2 as the desired probability. In the case of general n and m, we could compute the probability that the system is functional in a similar fashion. That is, we could count the number of configurations that result in the system's being functional and then divide by the total number of all possible configurations.

From the preceding discussion, we see that it would be useful to have an effective method for counting the number of ways that things can occur. In fact, many problems in probability theory can be solved simply by counting the number of different ways that a certain event can occur. The mathematical theory of counting is formally known as combinatorial analysis.
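For general n and m, the counting just described can also be carried out by brute force on a computer. The short Python sketch below is not part of the text; the function name and approach are only illustrative. It lists every way of choosing which m of the n antennas are defective, counts the choices in which no two defective antennas are adjacent, and divides by the total number of choices:

    from itertools import combinations

    def prob_functional(n, m):
        # Probability that a row of n antennas, exactly m of which are
        # defective, has no two consecutive defective antennas.
        total = 0        # all ways to pick the m defective positions
        functional = 0   # ways with no two adjacent defective positions
        for defective in combinations(range(n), m):
            total += 1
            if all(b - a > 1 for a, b in zip(defective, defective[1:])):
                functional += 1
        return functional / total

    print(prob_functional(4, 2))   # 0.5, matching the 3/6 = 1/2 found above

Of course, the point of this chapter is to obtain such counts directly, without enumerating every configuration.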
1.2 THE BASIC PRINCIPLE OF COUNTING

The basic principle of counting will be fundamental to all our work. Loosely put, it states that if one experiment can result in any of m possible outcomes and if another experiment can result in any of n possible outcomes, then there are mn possible outcomes of the two experiments.

The basic principle of counting

Suppose that two experiments are to be performed. Then if experiment 1 can result in any one of m possible outcomes and if, for each outcome of experiment 1, there are n possible outcomes of experiment 2, then together there are mn possible outcomes of the two experiments.

Proof of the Basic Principle: The basic principle may be proven by enumerating all the possible outcomes of the two experiments; that is,

(1, 1), (1, 2), ..., (1, n)
(2, 1), (2, 2), ..., (2, n)
...
(m, 1), (m, 2), ..., (m, n)

where we say that the outcome is (i, j) if experiment 1 results in its ith possible outcome and experiment 2 then results in its jth possible outcome. Hence, the set of possible outcomes consists of m rows, each containing n elements. This proves the result.

EXAMPLE 2a

A small community consists of 10 women, each of whom has 3 children. If one woman and one of her children are to be chosen as mother and child of the year, how many different choices are possible?

Solution. By regarding the choice of the woman as the outcome of the first experiment and the subsequent choice of one of her children as the outcome of the second experiment, we see from the basic principle that there are 10 · 3 = 30 possible choices.

When there are more than two experiments to be performed, the basic principle can be generalized.

The generalized basic principle of counting

If r experiments that are to be performed are such that the first one may result in any of n1 possible outcomes; and if, for each of these n1 possible outcomes, there are n2 possible outcomes of the second experiment; and if, for each of the possible outcomes of the first two experiments, there are n3 possible outcomes of the third experiment; and if ..., then there is a total of n1 · n2 ··· nr possible outcomes of the r experiments.

EXAMPLE 2b

A college planning committee consists of 3 freshmen, 4 sophomores, 5 juniors, and 2 seniors. A subcommittee of 4, consisting of 1 person from each class, is to be chosen. How many different subcommittees are possible?
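As a quick check of the generalized principle, the subcommittees of Example 2b can be enumerated directly: one choice is made from each class, so the principle predicts 3 · 4 · 5 · 2 of them. The following Python sketch is not part of the text, and the member names are placeholder labels; it simply confirms the count by listing every subcommittee:

    from itertools import product

    freshmen   = ["F1", "F2", "F3"]                # 3 freshmen
    sophomores = ["So1", "So2", "So3", "So4"]      # 4 sophomores
    juniors    = ["J1", "J2", "J3", "J4", "J5"]    # 5 juniors
    seniors    = ["Se1", "Se2"]                    # 2 seniors

    # A subcommittee consists of one member from each class.
    subcommittees = list(product(freshmen, sophomores, juniors, seniors))

    print(len(subcommittees))   # 120
    print(3 * 4 * 5 * 2)        # 120, as the generalized principle predicts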