to be able to tell what the value of X(ω) must be. As a further illustration of the fact that σ-algebras do a good job in modelling information, we have the following result.

Definition. Let {Xα, α ∈ I} be a family of random variables. Then the σ-algebra generated by {Xα, α ∈ I}, denoted by σ{Xα, α ∈ I}, is the smallest σ-algebra G such that all the random variables in {Xα, α ∈ I} are G-measurable.

Remark. Such a σ-algebra exists. (Recall the proof: consider the intersection of all σ-algebras on Ω with respect to which {Xα, α ∈ I} are measurable.)

Proposition. Let X = {X1, X2, ..., Xn} be a finite set of random variables and let Z be a random variable. Then Z is σ{X}-measurable iff there exists a Borel measurable function f : Rⁿ → R such that, for all ω ∈ Ω,

    Z(ω) = f(X1(ω), X2(ω), ..., Xn(ω)).    (2)

Proof. The case when σ{X} is generated by a finite partition (i.e. when the mapping T : Ω → Rⁿ defined via T(ω) = (X1(ω), X2(ω), ..., Xn(ω)) is F-simple) is not too hard and is left as an exercise. For the rest, see Williams (1991). □

2 The conditional expectation

Intuitively, the conditional expectation is the best predictor of the realization of a random variable given the available information. By "best" we will mean the one that minimizes the mean square error.
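The two ideas above can be illustrated numerically. The following sketch (a toy simulation, not from the text) estimates E[Y | X] for a discrete X by averaging Y within each level of X; the resulting predictor is a function of X alone, consistent with the proposition, and it achieves a smaller mean square error than a constant predictor or an arbitrary other function of X. The specific model Y = X² + noise is an assumption chosen purely for illustration.

```python
import random

random.seed(0)

# Toy model (illustrative assumption): X uniform on {0, 1, 2},
# Y = X^2 + standard Gaussian noise.
n = 100_000
data = []
for _ in range(n):
    x = random.choice([0, 1, 2])
    y = x * x + random.gauss(0, 1)
    data.append((x, y))

# Estimate E[Y | X]: average Y within each level of X.
# Note this predictor is a function of X alone, i.e. sigma(X)-measurable,
# in line with the proposition above.
cond_mean = {}
for level in (0, 1, 2):
    ys = [y for x, y in data if x == level]
    cond_mean[level] = sum(ys) / len(ys)

def mse(predict):
    """Empirical mean square error of a predictor x -> predict(x)."""
    return sum((y - predict(x)) ** 2 for x, y in data) / n

overall = sum(y for _, y in data) / n      # unconditional mean of Y

mse_cond = mse(lambda x: cond_mean[x])     # conditional expectation
mse_line = mse(lambda x: 2.0 * x)          # some other function of X
mse_mean = mse(lambda x: overall)          # best constant predictor

print(mse_cond, mse_line, mse_mean)
```

Running this, mse_cond comes out close to the noise variance 1, while the other two predictors do strictly worse, matching the claim that the conditional expectation minimizes the mean square error among predictors based on the available information.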