A Discriminative Approach to Topic-based Citation Recommendation

We see that the recommended papers are topic dependent. By nature, the problem of citation recommendation can be formalized as topic discovery, reference paper recommendation, and matching of the recommended papers with the citation sentences.

3 Our Approach

At a high level, our approach primarily consists of three steps:

1. We propose a two-layer Restricted Boltzmann Machine (RBM) model, referred to as RBM-CS. Given a collection of papers with citation relationships, the model learns a mixture of topic distributions over paper contents and citation relationships.
2. We present a method to rank papers for a given citation context, based on the learned topic model. We take the top-ranked papers as the recommended papers.
3. We describe a method to find the correspondence between the recommended papers and the citation sentences.

3.1 The RBM-CS Model

Restricted Boltzmann Machines (RBMs) [8] are undirected graphical models that use a layer of hidden variables to model a (topic) distribution over visible variables. In this work, we propose a two-layer RBM model, called RBM-CS, to jointly model papers and citations.

A graphical representation of the RBM-CS model is shown in Figure 2. In RBM-CS, the hidden layer h is associated with two visible layers, words w and citation relationships l, coupled with the interaction matrices M and U, respectively. The basic idea of RBM-CS is to capture the topic distribution of papers with a hidden topic layer, which is conditioned on both words and citation relationships. Words and citation relationships are considered to be generated from the hidden topics independently.

Fig. 2: Graphical representation of the RBM-CS model.

To train such a graphical model, we can consider maximizing the generative log-likelihood log p(w, l).
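To make the two-layer coupling concrete, the sketch below shows how a hidden topic activation could be computed from both visible layers through M and U. The dimensions, the logistic form of the hidden units, and all parameter values are illustrative assumptions following standard RBM practice, not details specified in the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions (illustrative only): V vocabulary words,
# P candidate papers, K hidden topics.
V, P, K = 1000, 500, 50

M = 0.01 * rng.standard_normal((V, K))  # word-topic interaction matrix
U = 0.01 * rng.standard_normal((P, K))  # citation-topic interaction matrix
b = np.zeros(K)                         # hidden (topic) biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hidden_given_visible(w, l):
    """p(h_k = 1 | w, l): topic activations conditioned on both the
    bag-of-words vector w and the citation indicator vector l."""
    return sigmoid(w @ M + l @ U + b)

# Toy input: a sparse bag-of-words vector and one observed citation.
w = np.zeros(V); w[[3, 17, 99]] = 1.0
l = np.zeros(P); l[[42]] = 1.0
h = hidden_given_visible(w, l)
print(h.shape)  # (50,), one activation per hidden topic
```

Each hidden unit pools evidence from both visible layers, which is exactly the structural point of Figure 2: words and citations share one topic layer.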
However, we are dealing with a predictive problem: our interest ultimately lies only in correct prediction of p(l | w), not necessarily in having a good model of p(w). Therefore, we define a discriminative objective function as a conditional log-likelihood:

\mathcal{L} = \sum_{d=1}^{D} \log p(\mathbf{l}_d \mid \mathbf{w}_d) = \sum_{d=1}^{D} \log \left( \prod_{j=1}^{L} p(l_j \mid \mathbf{w}_d) \right)    (1)
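The objective in Eq. (1) can be sketched in code as follows: sum over documents, and inside each document factor log p(l_d | w_d) over the individual citation links l_j. The particular form of p(l_j | w_d) used here, a logistic link unit driven by word-conditioned topic activations, is an assumption made for illustration and not the exact form derived from the RBM-CS energy function.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conditional_log_likelihood(W_docs, L_docs, M, U, b, c):
    """Eq. (1): sum over documents d of log p(l_d | w_d), factored over
    individual citation links l_j.  The per-link probability below
    (a logistic unit on top of word-only topic activations) is an
    illustrative assumption."""
    total = 0.0
    for w, l in zip(W_docs, L_docs):
        h = sigmoid(w @ M + b)          # topic activations from words only
        p_links = sigmoid(h @ U.T + c)  # assumed per-paper link probabilities
        # log prod_j p(l_j | w) = sum_j [l_j log p_j + (1 - l_j) log(1 - p_j)]
        total += np.sum(l * np.log(p_links) + (1 - l) * np.log(1 - p_links))
    return total

# Toy data: D documents over a V-word vocabulary and P candidate papers.
rng = np.random.default_rng(1)
V, P, K, D = 20, 10, 5, 3
M = 0.1 * rng.standard_normal((V, K))
U = 0.1 * rng.standard_normal((P, K))
b, c = np.zeros(K), np.zeros(P)
W_docs = (rng.random((D, V)) < 0.2).astype(float)
L_docs = (rng.random((D, P)) < 0.1).astype(float)
L_val = conditional_log_likelihood(W_docs, L_docs, M, U, b, c)
print(L_val)  # a finite negative number (sum of Bernoulli log-probabilities)
```

Maximizing this quantity with gradient ascent over M, U, b, and c would train the model to predict citations from words, which is the discriminative goal stated above.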