Note that if A and B are independent, then P[A|B] = P[A].

1.1.6 Definition (Random variable). A real random variable¹² on a probability space (Ω, Σ, P) is a function X: Ω → R that is P-measurable. (That is, for any a ∈ R, {ω ∈ Ω: X(ω) ≤ a} ∈ Σ.)

We can also consider random variables with values other than real numbers; for example, a random variable can have complex numbers or n-component vectors of real numbers as values. In such cases, a random variable is a measurable function from the probability space into the appropriate space with measure (the complex numbers or Rⁿ in the examples just mentioned). In this text, we will mostly consider real random variables.

1.1.7 Definition. The expectation¹³ of a (real) random variable X is

    E[X] = ∫_Ω X(ω) dP(ω).

Any real function on a finite probability space is a random variable. Its expectation can be expressed as

    E[X] = Σ_{ω∈Ω} p(ω) X(ω).

1.1.8 Definition (Independence of variables). Real random variables X, Y are independent if, for every two measurable sets A, B ⊆ R,

    P[X ∈ A and Y ∈ B] = P[X ∈ A] · P[Y ∈ B].

Note the shorthand notation for the events in the previous definition: for example, P[X ∈ A] stands for P[{ω ∈ Ω: X(ω) ∈ A}]. Intuitively, the independence of X and Y means that knowledge of the value attained by X gives us no information about Y, and vice versa.

In order to check independence, one need not consider all measurable sets A and B; it suffices to look at A = (−∞, a] and B = (−∞, b]. That is, if

    P[X ≤ a and Y ≤ b] = P[X ≤ a] · P[Y ≤ b]

for all a, b ∈ R, then X and Y are independent.

As we will check in Chapter 3, E[X + Y] = E[X] + E[Y] holds for any two random variables (provided that the expectations exist). On the other hand, E[XY] is generally different from E[X] · E[Y]. But we have

1.1.9 Lemma. If X and Y are independent random variables, then

    E[XY] = E[X] · E[Y].

¹² random variable = náhodná proměnná
¹³ expectation = střední hodnota
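To make the finite-space expectation formula of Definition 1.1.7 and Lemma 1.1.9 concrete, here is a small Python sketch; it is not part of the original text, and the two-dice example and all names in it (omega, p, expectation, X, Y) are ours. It builds the finite probability space of two independently rolled fair dice and checks numerically that E[XY] = E[X] · E[Y] for the two independent coordinates, while E[X·X] differs from E[X] · E[X].

```python
import itertools

# Finite probability space for two fair dice rolled independently:
# Omega = {1,...,6} x {1,...,6}, each outcome having probability 1/36.
omega = list(itertools.product(range(1, 7), repeat=2))
p = {w: 1.0 / len(omega) for w in omega}

def expectation(f):
    # Finite-space formula from Definition 1.1.7: E[f] = sum over w of p(w) * f(w).
    return sum(p[w] * f(w) for w in omega)

def X(w):
    return w[0]   # value of the first die

def Y(w):
    return w[1]   # value of the second die

e_x = expectation(X)                          # 3.5
e_y = expectation(Y)                          # 3.5
e_xy = expectation(lambda w: X(w) * Y(w))     # 12.25

# X and Y are independent, so Lemma 1.1.9 gives E[XY] = E[X] * E[Y].
assert abs(e_xy - e_x * e_y) < 1e-12

# For contrast, X is not independent of itself (it is not constant):
# E[X*X] = 91/6 ~ 15.17, which differs from E[X] * E[X] = 12.25.
e_xx = expectation(lambda w: X(w) * X(w))
print(e_x, e_y, e_xy, e_xx)
```

The sketch also illustrates why the lemma needs independence: taking Y = X makes the product expectation E[X²], which exceeds (E[X])² whenever X is not constant.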