a sequence of matrices Γt such that

    µt+1 = Γt µt.    (21)

Then X is said to be a finite-state Markov chain in discrete time. Call Γt the probability transition matrix. The interpretation of (21) is the following:

    P(Xt+1 = xi | Xt = xj) = Γt(i, j).

If Γt = Γ for all t, we call the process time-homogeneous or stationary. If Γ is sufficiently well-behaved, then X has a unique stationary distribution.

Definition. Let X be a stationary finite-state Markov chain and let T be the time of the first visit to state j after t = 0. Then state j is called recurrent (opposite: transient) if P({T < ∞} | X0 = xj) = 1.

Definition. The state j is called periodic with period δ > 1 if δ is the largest integer for which P({T = nδ for some n ≥ 1} | X0 = xj) = 1. If there is no such δ > 1, then j is called aperiodic.

Definition. A Markov chain is said to be aperiodic if all its states are aperiodic.

Definition. A state j can be reached from state i if there exists an integer n ≥ 0 such that Γⁿ(j, i) > 0.
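A minimal numerical sketch of the ideas above, using a hypothetical 3-state chain (the matrix entries are illustrative, not from the notes). Following the convention of (21), µt+1 = Γ µt with µ a column vector, so Γ is column-stochastic and the stationary distribution π is the eigenvector of Γ for eigenvalue 1:

```python
import numpy as np

# Hypothetical 3-state transition matrix in the column-stochastic
# convention of the notes: Gamma[i, j] = P(X_{t+1} = x_i | X_t = x_j),
# so each COLUMN sums to 1.
Gamma = np.array([
    [0.5, 0.2, 0.3],
    [0.3, 0.5, 0.3],
    [0.2, 0.3, 0.4],
])
assert np.allclose(Gamma.sum(axis=0), 1.0)

# Stationary distribution: the eigenvector of Gamma for eigenvalue 1,
# normalized so its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(Gamma)
k = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()

# All entries of Gamma are positive, so every state can be reached from
# every other in one step and the chain is aperiodic; iterating (21)
# from any initial distribution then converges to pi.
mu = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    mu = Gamma @ mu
```

After the loop, `mu` agrees with `pi` to numerical precision, illustrating the "unique stationary distribution" claim for a well-behaved Γ.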