Markov Chains
• Definition
  – A sequence $X_1, X_2, \dots$ of random elements of some set is a Markov chain if the conditional distribution of $X_{n+1}$ given $X_1, \dots, X_n$ depends on $X_n$ only:
    $P(X_{n+1} \mid X_1, \dots, X_n) = P(X_{n+1} \mid X_n)$
  – State space $S$: the set in which the $X_i$ take values.
  – Transition probabilities: the conditional distribution of $X_{n+1}$ given $X_n$,
    $p_{ij} = P(X_{n+1} = x_j \mid X_n = x_i), \quad i = 1, \dots, n,\ j = 1, \dots, n \quad (S \text{ finite})$
    Stationary transition probabilities: the transition probabilities do not depend on $n$.
  – Initial distribution: the marginal distribution of $X_1$.
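As a concrete illustration of these definitions, the following is a minimal sketch (not from the slide) of simulating a finite-state Markov chain with stationary transition probabilities. The transition matrix `P`, initial distribution `init`, and two-state example at the bottom are illustrative assumptions, not taken from the source.

```python
import numpy as np


def simulate_markov_chain(P, init, n_steps, rng=None):
    """Draw X_1, ..., X_{n_steps} from a finite-state Markov chain.

    P    : (k, k) array; row i is the conditional distribution of X_{n+1} given X_n = i
           (stationary transition probabilities: the same matrix is used at every step).
    init : length-k array; the initial (marginal) distribution of X_1.
    """
    rng = np.random.default_rng() if rng is None else rng
    k = len(init)
    states = np.empty(n_steps, dtype=int)
    states[0] = rng.choice(k, p=init)  # X_1 ~ initial distribution
    for n in range(1, n_steps):
        # Markov property: X_{n+1} depends on the past only through X_n
        states[n] = rng.choice(k, p=P[states[n - 1]])
    return states


# Hypothetical example: two-state chain with S = {0, 1}
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
init = np.array([0.5, 0.5])
print(simulate_markov_chain(P, init, n_steps=10))
```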