Suppose that the current MRF has $n$ vertices and bounded maximum degree, and each update to the MRF may change the underlying graph or all vertex/edge potentials, as long as the total amount of change is bounded. Our algorithm maintains an approximate solution to the inference with $\widetilde{O}(nN(n))$ space cost and $\widetilde{O}(N(n)+n)$ incremental time cost upon each update, assuming that the MRFs satisfy the Dobrushin-Shlosman condition [DS85a, DS85b, DS87]. The condition has been widely used to imply the efficiency of Markov chain Monte Carlo (MCMC) sampling (e.g. see [Hay06, DGJ08]). Compared to the static algorithm, which requires $\Omega(nN(n))$ time for redrawing all $N(n)$ samples each time, our dynamic algorithm significantly improves the time cost, with an $\widetilde{\Omega}(\min\{n, N(n)\})$-factor speedup.

On specific models, the Dobrushin-Shlosman condition has been established in the literature, which directly gives us the following efficient dynamic inference algorithms, with $\widetilde{O}(nN(n))$ space cost and $\widetilde{O}(N(n)+n)$ time cost per update, on graphs with $n$ vertices and maximum degree $\Delta = O(1)$ (a sanity check of these thresholds is sketched after this list):

• for the Ising model with temperature $\beta$ satisfying $e^{-2|\beta|} > 1 - \frac{2}{\Delta+1}$, which is close to the uniqueness threshold $e^{-2|\beta_c|} = 1 - \frac{2}{\Delta}$, beyond which the static versions of the sampling and marginal inference problems for the anti-ferromagnetic Ising model are intractable [GŠV16, GŠV15];
• for the hardcore model with fugacity $\lambda < \frac{2}{\Delta-2}$, which matches the best bound known for sampling algorithms with near-linear running time on general graphs with bounded maximum degree [Vig99, LV99, DG00];
• for proper $q$-colorings with $q > 2\Delta$, which matches the best bound known for sampling algorithms with near-linear running time on general graphs with bounded maximum degree [Jer95].
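To make the thresholds above concrete, here is a minimal sketch (ours, not from the paper; the function names are hypothetical) that simply evaluates the three stated inequalities for given model parameters:

```python
import math

# Hedged illustration: these helpers merely check the three threshold
# inequalities listed above; they are not part of the paper's algorithm.

def ising_in_regime(beta: float, Delta: int) -> bool:
    # Ising: e^{-2|beta|} > 1 - 2/(Delta + 1)
    return math.exp(-2.0 * abs(beta)) > 1.0 - 2.0 / (Delta + 1)

def hardcore_in_regime(lam: float, Delta: int) -> bool:
    # Hardcore: lambda < 2/(Delta - 2); stated for Delta > 2
    return Delta > 2 and lam < 2.0 / (Delta - 2)

def coloring_in_regime(q: int, Delta: int) -> bool:
    # Proper q-coloring: q > 2*Delta
    return q > 2 * Delta

# Example: with maximum degree 5, fugacity 0.5 < 2/3 is within the regime,
# while q = 10 = 2*Delta colors is not (the inequality is strict).
assert hardcore_in_regime(0.5, 5)
assert not coloring_in_regime(10, 5)
```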
Our dynamic inference algorithm is based on a dynamic sampling algorithm, which efficiently maintains $N(n)$ independent samples for the current MRF while the MRF is subject to changes. More specifically, we give a dynamic version of the Gibbs sampling algorithm, a local Markov chain for sampling from the Gibbs distribution that has been studied extensively. Our techniques are based on: (1) couplings for dynamic instances of graphical models; and (2) dynamic data structures for representing single-site Markov chains so that the couplings can be realized algorithmically in sub-linear time. Both of these techniques are of independent interest, and can be naturally extended to more general settings with multi-body interactions.
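For concreteness, the following is a minimal sketch of the static single-site Gibbs sampler (Glauber dynamics) for the Ising model, the chain whose executions the dynamic algorithm maintains; the couplings and dynamic data structures that make the updates sub-linear are the paper's contribution and are not reproduced here. The code is ours and assumes the convention $\mu(\sigma) \propto \exp(\beta \sum_{(u,v) \in E} \sigma_u \sigma_v)$.

```python
import math
import random

def gibbs_ising(neighbors: list[list[int]], beta: float, steps: int,
                rng: random.Random) -> list[int]:
    """Single-site Gibbs sampling (Glauber dynamics) for the Ising model;
    an illustrative sketch, not the paper's dynamic algorithm."""
    n = len(neighbors)
    sigma = [rng.choice([-1, 1]) for _ in range(n)]       # arbitrary initial state
    for _ in range(steps):
        v = rng.randrange(n)                              # uniformly random vertex
        s = sum(sigma[u] for u in neighbors[v])           # local field at v
        p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * s))  # P(sigma_v = +1 | rest)
        sigma[v] = 1 if rng.random() < p_plus else -1
    return sigma

# Example: a 4-cycle at inverse temperature 0.2.
cycle = [[1, 3], [0, 2], [1, 3], [0, 2]]
sample = gibbs_ising(cycle, beta=0.2, steps=200, rng=random.Random(42))
```

Roughly speaking, the dynamic algorithm couples the execution of such a chain on the instance before an update with an execution on the updated instance, so that only the steps where the two executions differ need to be recomputed.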
Our results show that, on dynamically changing graphical models, sampling-based probabilistic inference can be solved significantly faster than by rerunning the static algorithm each time. This has practical significance in speeding up the iterative procedures for learning graphical models.

1.2. Related work. The problem of dynamic sampling from graphical models was introduced very recently in [FVY19]. There, a dynamic sampling algorithm was given for graphical models with soft constraints; it can only deal with local updates that change a single vertex or edge at a time, and the regimes in which such dynamic sampling is efficient with small incremental cost are much more restrictive than the conditions for the rapid mixing of Markov chains. Our algorithm greatly improves the regimes for efficient dynamic sampling for the Ising and hardcore models in [FVY19], and for the first time can handle non-local updates that change all vertex/edge potentials simultaneously. Besides, dynamic/online sampling from log-concave distributions was also studied in [NR17, LMV19].

Another related topic is dynamic graph problems, which ask to maintain a solution (e.g. spanners [FG19, NSWN17, WN17] or shortest paths [BC16, HKN16, HKN14]) while the input graph is dynamically changing. More recently, important progress has been made on dynamically maintaining structures related to graph random walks, such as spectral sparsifiers [DGGP19, ADK+16] or effective resistances [DGGP18, GHP18]. Instead of one particular solution, dynamic inference problems ask to maintain a statistic of an exponential-sized probability space described by a dynamically changing graphical model.

1.3. Organization of the paper. In Section 2, we formally introduce the dynamic inference problem. In Section 3, we formally state the main results. Preliminaries are given in Section 4. In Section 5, we outline our dynamic inference algorithm. In Section 6, we present the algorithms for dynamic Gibbs sampling. The analyses of these dynamic sampling algorithms are given in Section 7. The proof of the main theorem on dynamic inference is given in Section 8. The conclusion is given in Section 9.