Mixing
• Why should a uniform random walk on a graph be rapidly mixing?
• Why should a Markov chain be rapidly mixing?
Sample Matchings
• input: a graph $G(V,E)$
• a matching is a set $M \subseteq E$ such that no two edges $e_1, e_2 \in M$ share a common vertex
• goal: sample a uniform random matching in $G$
the Jerrum-Sinclair random walk on matchings
at each step, with current matching $M$:
• with probability 1/2, do nothing;
• otherwise, pick a uniform random edge $e = uv \in E$:
  (↑) if neither $u$ nor $v$ is matched, add $e$ to $M$;
  (↓) if $e \in M$, remove $e$ from $M$;
  (↔) if exactly one of $u, v$ is matched, say by $e' \in M$, replace $e'$ by $e$;
  (if both $u$ and $v$ are matched by other edges, the walk stays at $M$.)
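To make the walk concrete, here is a minimal Python sketch of a single step (my own illustration, not code from the lecture; edges are represented as frozensets so orientation does not matter):

```python
import random

def js_step(edges, M):
    """One step of the Jerrum-Sinclair random walk on matchings.

    edges: list of edges of G, each a frozenset {u, v}
    M: current matching, a set of such edges
    Returns the next matching.
    """
    M = set(M)
    if random.random() < 0.5:            # lazy step: do nothing w.p. 1/2
        return M
    e = random.choice(edges)             # uniform random edge e = uv
    u, v = tuple(e)
    matched = {x for f in M for x in f}  # vertices covered by M
    if e in M:                           # (down) remove e from M
        M.remove(e)
    elif u not in matched and v not in matched:
        M.add(e)                         # (up) both endpoints free: add e
    elif (u in matched) != (v in matched):
        # (exchange) exactly one endpoint is matched, by some e' in M
        x = u if u in matched else v
        e_prime = next(f for f in M if x in f)
        M.remove(e_prime)                # replace e' by e
        M.add(e)
    # otherwise both endpoints are matched by other edges: stay put
    return M

# example: simulate the walk on a 4-cycle, starting from the empty matching
edges = [frozenset(p) for p in [(0, 1), (1, 2), (2, 3), (3, 0)]]
M = set()
for _ in range(1000):
    M = js_step(edges, M)
print(M)  # a (near-)uniform random matching after enough steps
```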
Mixing
• Why should a uniform random walk on a graph be rapidly mixing?
• Why should a Markov chain be rapidly mixing?
• for an initial distribution $q$: how fast does $\|qP^t - \pi\|_1$ decrease?
Spectral Decomposition
Spectral Theorem: if $P$ is a symmetric $n \times n$ matrix with eigenvalues $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n$, then the corresponding eigenvectors $v_1, v_2, \ldots, v_n$ can be chosen to be orthonormal.
For every $q \in \mathbb{R}^n$:
$q = \sum_{i=1}^{n} c_i v_i$ where $c_i = q^{\mathsf T} v_i$
$qP = \sum_{i=1}^{n} c_i v_i P = \sum_{i=1}^{n} c_i \lambda_i v_i$
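As a quick numerical illustration (my own sketch, not from the slides), both identities can be verified with NumPy for a small symmetric chain, here the lazy random walk on a 4-cycle:

```python
import numpy as np

# lazy random walk on a 4-cycle: a symmetric stochastic matrix
n = 4
P = np.zeros((n, n))
for i in range(n):
    P[i, i], P[i, (i + 1) % n], P[i, (i - 1) % n] = 0.5, 0.25, 0.25

lam, V = np.linalg.eigh(P)            # eigenvalues (ascending), orthonormal eigenvectors as columns
q = np.array([1.0, 0.0, 0.0, 0.0])    # some vector (here a distribution)

c = V.T @ q                           # c_i = q^T v_i
print(np.allclose(q, V @ c))                 # q  = sum_i c_i v_i
print(np.allclose(q @ P, V @ (lam * c)))     # qP = sum_i c_i lambda_i v_i
```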
Mixing of Symmetric Chain
$\mathcal{M} = ([n], P)$ with $P$ symmetric; stationary distribution $\pi = \left(\frac1n, \ldots, \frac1n\right)$
• eigenvalues: $1 = \lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n$ (Perron-Frobenius)
• orthonormal eigenbasis $v_1, v_2, \ldots, v_n$
• $q \in [0,1]^n$ is a distribution, so $\|q\|_1 = 1$
$q = \sum_{i=1}^{n} c_i v_i$ where $c_i = q^{\mathsf T} v_i$
Since $P$ is symmetric and stochastic (hence doubly stochastic), $\mathbf{1}P = \mathbf{1}$, so $v_1 = \frac{\mathbf{1}}{\|\mathbf{1}\|_2} = \left(\frac{1}{\sqrt n}, \ldots, \frac{1}{\sqrt n}\right)$, giving $c_1 = q^{\mathsf T} v_1 = \frac{1}{\sqrt n}$ and $c_1 v_1 = \left(\frac1n, \ldots, \frac1n\right) = \pi$.
Therefore $q = \pi + \sum_{i=2}^{n} c_i v_i$, and
$qP^t = \pi P^t + \sum_{i=2}^{n} c_i v_i P^t = \pi + \sum_{i=2}^{n} c_i \lambda_i^t v_i$
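Again as a sketch (same toy chain, the lazy walk on a 4-cycle), one can check numerically that the $v_1$-component of any starting distribution $q$ is exactly $\pi$, so only the components along $v_2, \ldots, v_n$ have to decay:

```python
import numpy as np

# the v_1-component of a distribution q is pi = (1/n, ..., 1/n)
n = 4
P = np.zeros((n, n))
for i in range(n):
    P[i, i], P[i, (i + 1) % n], P[i, (i - 1) % n] = 0.5, 0.25, 0.25

q = np.array([1.0, 0.0, 0.0, 0.0])       # start concentrated on one state
v1 = np.full(n, 1.0 / np.sqrt(n))        # eigenvector of eigenvalue 1
c1 = q @ v1                              # q^T v_1 = 1/sqrt(n) for any distribution q
print(np.allclose(c1 * v1, np.full(n, 1.0 / n)))                           # c_1 v_1 = pi
print(np.allclose(q @ np.linalg.matrix_power(P, 200), 1.0 / n, atol=1e-6)) # qP^t -> pi
```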
Define $\lambda_{\max} \triangleq \max\{|\lambda_2|, |\lambda_n|\}$. Then
$\|qP^t - \pi\|_1 = \left\|\sum_{i=2}^{n} c_i \lambda_i^t v_i\right\|_1 \le \sqrt n \left\|\sum_{i=2}^{n} c_i \lambda_i^t v_i\right\|_2$ (Cauchy-Schwarz)
$= \sqrt n \sqrt{\sum_{i=2}^{n} c_i^2 \lambda_i^{2t}} \le \sqrt n\, \lambda_{\max}^t \sqrt{\sum_{i=2}^{n} c_i^2} \le \sqrt n\, \lambda_{\max}^t \|q\|_2 \le \sqrt n\, \lambda_{\max}^t$
(the last step uses $\|q\|_2 \le \|q\|_1 = 1$). Since total variation distance is half the $\ell_1$ distance,
$\Delta(t) \le \frac{\sqrt n}{2}\, \lambda_{\max}^t \le \frac{\sqrt n}{2}\, e^{-t(1-\lambda_{\max})}$
$\Rightarrow \tau(\epsilon) \le \frac{\frac12 \ln n + \ln\frac{1}{2\epsilon}}{1-\lambda_{\max}}$
(with $\Delta(t)$ and $\tau(\epsilon)$ as defined below).
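The final step is just solving the exponential bound for $t$; spelled out (my rearrangement):
$$\frac{\sqrt n}{2}\, e^{-t(1-\lambda_{\max})} \le \epsilon \iff t(1-\lambda_{\max}) \ge \ln\frac{\sqrt n}{2\epsilon} \iff t \ge \frac{\frac12 \ln n + \ln\frac{1}{2\epsilon}}{1-\lambda_{\max}},$$
so every $t$ at least this large satisfies $\Delta(t) \le \epsilon$, which is exactly the stated bound on $\tau(\epsilon)$.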
$\mathcal{M} = (\Omega, P)$ with stationary distribution $\pi$
• $p_x^{(t)}$: the distribution at time $t$ when the initial state is $x$
• $\Delta_x(t) = \|p_x^{(t)} - \pi\|_{TV}$,  $\Delta(t) = \max_{x \in \Omega} \Delta_x(t)$
• $\tau_x(\epsilon) = \min\{t \mid \Delta_x(t) \le \epsilon\}$,  $\tau(\epsilon) = \max_{x \in \Omega} \tau_x(\epsilon)$
Theorem. Suppose $P$ is symmetric, with eigenvalues $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n$. Let $\lambda_{\max} = \max\{|\lambda_2|, |\lambda_n|\}$. Then
$\tau(\epsilon) \le \frac{\frac12 \ln n + \ln\frac{1}{2\epsilon}}{1-\lambda_{\max}}$
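A small numerical sanity check of the theorem (a sketch; I use the lazy random walk on a cycle of length 20 as the symmetric chain, and an ad hoc brute-force computation of the empirical mixing time):

```python
import numpy as np

# compare the theorem's bound on tau(eps) with the empirical mixing time
# for the lazy random walk on a cycle of length n (a symmetric chain)
n = 20
P = np.zeros((n, n))
for i in range(n):
    P[i, i], P[i, (i + 1) % n], P[i, (i - 1) % n] = 0.5, 0.25, 0.25

pi = np.full(n, 1.0 / n)
lam = np.linalg.eigvalsh(P)                   # ascending: lam[0]=lambda_n, lam[-1]=lambda_1
lam_max = max(abs(lam[-2]), abs(lam[0]))      # max{|lambda_2|, |lambda_n|}

eps = 0.01
bound = (0.5 * np.log(n) + np.log(1 / (2 * eps))) / (1 - lam_max)

def tv(p, q):
    return 0.5 * np.abs(p - q).sum()          # total variation = half the l1 distance

# empirical tau(eps): smallest t with worst-case TV distance at most eps
t, Pt = 0, np.eye(n)
while max(tv(Pt[x], pi) for x in range(n)) > eps:
    Pt = Pt @ P
    t += 1
print(f"lambda_max = {lam_max:.4f}, theorem bound = {bound:.1f}, empirical tau = {t}")
```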
Reversibility
detailed balance equation: $\pi(x)P(x,y) = \pi(y)P(y,x)$ (the quantity $\pi(x)P(x,y)$ is the ergodic flow from $x$ to $y$)
time-reversible Markov chain: $\exists \pi$ such that $\forall x, y \in \Omega$: $\pi(x)P(x,y) = \pi(y)P(y,x)$
such a $\pi$ is a stationary distribution:
$(\pi P)_y = \sum_x \pi(x)P(x,y) = \sum_x \pi(y)P(y,x) = \pi(y)$
time-reversible: when started from $\pi$, $(X_0, X_1, \ldots, X_n) \sim (X_n, X_{n-1}, \ldots, X_0)$
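A minimal sketch (my own example): for the simple random walk on an undirected graph, $\pi(x) \propto \deg(x)$ satisfies detailed balance, and the check below also confirms that this $\pi$ is stationary:

```python
import numpy as np

# simple random walk on an undirected graph is time-reversible
# with pi(x) proportional to deg(x); verify detailed balance and stationarity
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)   # adjacency matrix of a small graph

deg = A.sum(axis=1)
P = A / deg[:, None]              # P(x, y) = 1/deg(x) if {x, y} is an edge
pi = deg / deg.sum()              # pi(x) = deg(x) / (2|E|)

F = pi[:, None] * P               # ergodic flows F(x, y) = pi(x) P(x, y)
print(np.allclose(F, F.T))        # detailed balance: pi(x)P(x,y) = pi(y)P(y,x)
print(np.allclose(pi @ P, pi))    # hence pi is stationary: pi P = pi
```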