1. Introduction

Probabilistic graphical models provide a rich language for describing high-dimensional distributions in terms of the dependence structures between random variables. The Markov random field (MRF) is a basic graphical model that encodes pairwise interactions of complex systems. Given a graph $G = (V, E)$, each vertex $v \in V$ is associated with a function $\phi_v : Q \to \mathbb{R}$, called the vertex potential, on a finite domain $Q = [q]$ of $q$ spin states, and each edge $e \in E$ is associated with a symmetric function $\phi_e : Q^2 \to \mathbb{R}$, called the edge potential, which describes a pairwise interaction. Together, these induce a probability distribution $\mu$ over all configurations $\sigma \in Q^V$:
\[
\mu(\sigma) \propto \exp(H(\sigma)) = \exp\left( \sum_{v \in V} \phi_v(\sigma_v) + \sum_{e = \{u,v\} \in E} \phi_e(\sigma_u, \sigma_v) \right).
\]
This distribution $\mu$ is known as the Gibbs distribution and $H(\sigma)$ is the Hamiltonian. It arises naturally from various physical models, problems in statistics and learning, and combinatorial problems in computer science [MM09, KFB09].
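To make the definition concrete, the following minimal sketch (ours, not from the paper) evaluates the Hamiltonian and the resulting Gibbs distribution on a toy two-spin MRF; all identifiers and the choice of potentials are illustrative.

```python
import math
from itertools import product

# A toy MRF on a 3-vertex path with q = 2 spin states; the names
# vertex_potential, edge_potential, hamiltonian are ours, not the paper's.
q = 2
vertices = [0, 1, 2]
edges = [(0, 1), (1, 2)]
vertex_potential = {v: (lambda s: 0.1 * s) for v in vertices}               # phi_v : Q -> R
edge_potential = {e: (lambda s, t: 0.5 if s == t else 0.0) for e in edges}  # phi_e, symmetric

def hamiltonian(sigma):
    """H(sigma) = sum_v phi_v(sigma_v) + sum_{e={u,v}} phi_e(sigma_u, sigma_v)."""
    h = sum(vertex_potential[v](sigma[v]) for v in vertices)
    h += sum(edge_potential[(u, v)](sigma[u], sigma[v]) for (u, v) in edges)
    return h

# mu(sigma) is proportional to exp(H(sigma)); on a toy instance the
# normalizing constant can be computed by brute force over all q^n configurations.
configs = list(product(range(q), repeat=len(vertices)))
Z = sum(math.exp(hamiltonian(s)) for s in configs)
mu = {s: math.exp(hamiltonian(s)) / Z for s in configs}
print(sum(mu.values()))  # 1.0 up to floating-point error
```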
Probabilistic inference is one of the most fundamental computational problems in graphical models. Basic inference problems ask to calculate the marginal distribution, conditional distribution, or maximum-a-posteriori probabilities of one or several random variables [WJ08]. Sampling is perhaps the most widely used approach for probabilistic inference. Given a graphical model, independent samples are drawn from the Gibbs distribution, and certain statistics are computed from the samples to give estimates of the inferred quantity. For most typical inference problems, such statistics are easy to compute once the samples are given: for instance, to estimate the marginal distribution on a variable subset $S$, the statistic is the frequency of each configuration in $Q^S$ among the samples. The cost of inference is therefore dominated by the cost of generating random samples [JVV86, ŠVV09].
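As a small illustration of this sampling-based approach (the function name and arguments are ours), the sketch below estimates the marginal on a subset $S$ by the empirical frequencies just described.

```python
from collections import Counter

def estimate_marginal(samples, S):
    """Empirical estimate of the marginal of mu on the variable subset S:
    the frequency of each configuration in Q^S among the samples."""
    counts = Counter(tuple(sigma[v] for v in S) for sigma in samples)
    n = len(samples)
    return {config: c / n for config, c in counts.items()}

# Usage sketch: with `samples` drawn independently from the Gibbs
# distribution (e.g., exact samples on the toy instance above), the
# estimate converges to the true marginal as the number of samples grows.
# marginal_02 = estimate_marginal(samples, S=[0, 2])
```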
Classic probabilistic inference assumes a static setting, where the input graphical model is fixed. In today's applications, dynamically changing graphical models naturally arise in many scenarios. In various practical algorithms for learning graphical models, e.g. the contrastive divergence algorithm for learning the restricted Boltzmann machine [Hin12] and the iterative proportional fitting algorithm for maximum likelihood estimation of graphical models [WJ08], the optimal model $I^*$ is obtained by updating the parameters of the graphical model iteratively (usually by gradient descent), which generates a sequence of graphical models $I_1, I_2, \ldots, I_M$, with the goal that $I_M$ is a good approximation of $I^*$. Also, in the study of multivariate time-series data, the dynamic Gaussian graphical models [CW+07], multiregression dynamic models [QS93], dynamic graphical models [QS92], and dynamic chain graph models [AQ+17] are all dynamically changing graphical models and have been used in a variety of applications. Meanwhile, with the advent of Big Data, scalable machine learning systems need to deal with continuously evolving graphical models (see e.g. [RKD+19] and [SWA09]).

Theoretical studies of probabilistic inference in dynamically changing graphical models are lacking. In the practical scenarios mentioned above, it is common that a sequence of graphical models is presented over time, where any two consecutive graphical models can differ from each other in all potentials, but by a small total amount. Recomputing the inference problem from scratch every time the graphical model changes gives correct solutions, but is very wasteful. A fundamental question is whether probabilistic inference can be solved dynamically and efficiently.

In this paper, we study the problem of probabilistic inference in an MRF when the MRF itself is changing dynamically with time. At each time, the whole graphical model, including all vertices and edges as well as their potentials, is subject to changes. Such non-local updates are very general and cover all the applications mentioned above. The problem of dynamic inference then asks to maintain a correct answer to the inference in a dynamically changing MRF, with a low incremental cost proportional to the amount of change made to the graphical model at each time.

1.1. Our results. We give a dynamic algorithm for sampling-based probabilistic inference. Given an MRF instance with $n$ vertices, suppose that $N(n)$ independent samples are sufficient to give an approximate solution to the inference problem, where $N : \mathbb{N}^+ \to \mathbb{N}^+$ is a polynomial function. We give dynamic algorithms for general inference problems on dynamically changing MRFs.
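To fix ideas, the following hypothetical interface summarizes the dynamic inference problem as stated above. It is a problem-statement sketch under our own naming; the methods `sample()`, `apply()`, and `refresh()` are placeholders, not the algorithm developed in the paper.

```python
class DynamicInference:
    """Hypothetical interface for the dynamic inference problem: a sketch
    of the problem statement only, NOT the paper's algorithm."""

    def __init__(self, mrf, num_samples):
        self.mrf = mrf
        # Invariant: `samples` holds num_samples independent samples from
        # the Gibbs distribution of the *current* MRF.
        self.samples = [self.mrf.sample() for _ in range(num_samples)]

    def update(self, new_vertex_potentials, new_edge_potentials):
        # A non-local update: any subset of the potentials may change,
        # as long as the total amount of change is small.
        self.mrf.apply(new_vertex_potentials, new_edge_potentials)
        # The goal is to restore the invariant at an incremental cost
        # proportional to the size of the update, instead of drawing
        # all N(n) samples from scratch.
        self.samples = self.mrf.refresh(self.samples)  # placeholder step
```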