Figure 1: (a) Standard PRM learned on EachMovie dataset (b) Ground Bayesian network for one Vote object

Figure 2: Sample class hierarchy (Movie specialized into Action, Comedy, and Thriller; Comedy further specialized into Romantic Comedy and Slapstick Comedy)

3.2 Overview

To address the problem described above, we must introduce a class hierarchy that applies to our dataset and modify the PRM learning procedure to leverage this class hierarchy in making predictions. In general, the class hierarchy can either be provided as input or learned directly from the data. We refer to the class hierarchy for class X as H[X]. Figure 2 shows a sample class hierarchy for the EachMovie domain. H[X] is a DAG that defines an IS-A hierarchy using the subclass relation ≺ over a finite set of subclasses C[X] [Get02]. For a given c, d ∈ C[X], c ≺ d indicates that Xc is a direct subclass of Xd (and Xd is a direct superclass of Xc). The leaf nodes of H[X] represent the basic subclasses of the hierarchy, denoted basic(H[X]). In this paper we assume all objects are members of a basic subclass, although this is not a fundamental limitation of hPRMs.
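The hierarchy machinery above can be illustrated with a minimal sketch. Assuming a simple dict-based DAG for H[Movie] mirroring the sample hierarchy in Figure 2, the code below computes the basic (leaf) subclasses, tests the ≺ subclass relation, and partitions objects by their subclass indicator X.Class so each basic subclass can use its own specialized CPD. The representation and helper names are illustrative, not part of the hPRM formalism.

```python
# Illustrative sketch of H[X] and partitioning by the subclass indicator.
from collections import defaultdict

# Direct-subclass edges: child -> direct superclass, following the
# sample EachMovie hierarchy (genre names taken from Figure 2).
PARENT = {
    "Action": "Movie",
    "Comedy": "Movie",
    "Thriller": "Movie",
    "RomanticComedy": "Comedy",
    "SlapstickComedy": "Comedy",
}

def basic_subclasses(parent):
    """basic(H[X]): leaf nodes, i.e. subclasses that are never a parent."""
    parents = set(parent.values())
    return {c for c in parent if c not in parents}

def is_subclass(c, d, parent):
    """True if c is a (possibly indirect) subclass of d under the ≺ relation."""
    while c in parent:
        c = parent[c]
        if c == d:
            return True
    return False

def partition_by_class(objects):
    """Group objects by their subclass indicator X.Class; during learning,
    each basic-subclass group would be fit with its own specialized CPD."""
    groups = defaultdict(list)
    for obj in objects:
        groups[obj["Class"]].append(obj)
    return groups

# Hypothetical Vote-like records tagged with a subclass indicator.
votes = [
    {"Class": "Action", "Rank": 4},
    {"Class": "RomanticComedy", "Rank": 5},
    {"Class": "Action", "Rank": 2},
]
groups = partition_by_class(votes)
```

Here the subclass indicator is stored explicitly on each record; in the paper's setting it may instead be assigned by a supplementary algorithm, but the partitioning step is the same either way.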
Each object of class X has a subclass indicator X.Class ∈ basic(H[X]), which can either be defined manually or learned automatically by a supplementary algorithm. By defining a hierarchy for a class X in a PRM, we also implicitly specialize the classes that are reachable from X via one or more reference slots. For example, if we specialize the Movie class, we implicitly specialize the related Vote table into a hierarchy as well: in Figure 3, the Vote class is refined into four different pseudo-classes, each associated with one of the hierarchy elements in basic(H[X]).

Definition 2 A Hierarchical Probabilistic Relational Model (hPRM) ΠH is defined as:

• A class hierarchy H[X] = (C[X], ≺)
• A set of basic, leaf-node elements basic(H[X]) ∈ H[X]
• A subclass indicator attribute X.Class ∈ basic(H[X])
• For each subclass c ∈ C[X] and attribute A ∈ A(X), a specialized CPD for c, denoted P(Xc.A | Pac(X.A))
• For every class Y reachable via a reference slot chain from X, a specialized CPD for c, denoted P(Yc.A | Pac(Y.A))

The algorithm for learning an hPRM is very similar to the algorithm for learning a standard PRM. Instead of dealing with the standard set of classes X when evaluating structure quality and estimating parameters, our hPRM algorithm dynamically partitions the dataset into the subclasses defined by H[X]. For inference, a similar technique is used: for any given instance i of a class, i's place in the hierarchy is flagged through X.Class; using this flag it is possible to associate the proper CPD with a given class instance.

3.3 Applying hPRMs to the EachMovie Dataset

Applying the hPRM framework to the EachMovie dataset requires a hierarchy to be defined, which is