IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, VOL. X, NO. X, XXX 200X 6

instance Selection), based on a novel instance-based feature mapping and the 1-norm SVM model for simultaneous feature selection and classification. However, most of the existing methods cannot answer the essential question giving rise to the gap between multiple-instance learning and single-instance learning: which are the true positive instances, and what properties do the true positive instances have? This motivates the work in this paper.

3 MILD: MULTIPLE-INSTANCE LEARNING VIA DISAMBIGUATION

In this section, our disambiguation method is described in detail, and then two feature representation schemes are presented for instance-level classification and bag-level classification, respectively, based on which the MIL problem is converted into a standard single-instance learning problem that can be directly solved by SIL algorithms, such as SVM. In addition, multi-view learning [12] can be easily adapted for MIL by combining the two views, the instance-level view and the bag-level view, of the bags.

3.1 Notations

$B_i^+$ denotes a positive bag and $B_i^-$ denotes a negative bag. When the label of a bag does not matter, we simply denote the bag as $B_i$. $B_{ij}^+$ denotes an instance in a positive bag $B_i^+$ and $B_{ij}^-$ is an instance in a negative bag $B_i^-$. Let $\mathbb{B} = \{B_1^+, B_2^+, \ldots, B_{n^+}^+, B_1^-, B_2^-, \ldots, B_{n^-}^-\}$ denote the set of $n^+$ positive and $n^-$ negative training bags. $l(B_i) \in \{+1, -1\}$ is the bag label of $B_i$, and $l(B_{ij}) \in \{+1, -1\}$ is the instance label of $B_{ij}$. In general, we always represent all instances as feature vectors of the same dimensionality. Hence, in this paper, an instance also refers to a feature vector.
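To make the notation concrete, here is a minimal sketch (not from the paper; the helper names and random features are illustrative assumptions) of how such a set of training bags might be stored in Python, with each bag a NumPy array whose rows are instance feature vectors of a common dimensionality:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_bag(n_instances, dim=4):
    """Create a bag of `n_instances` feature vectors of dimension `dim`.
    Random features stand in for real instance descriptors."""
    return rng.normal(size=(n_instances, dim))

# n+ = 2 positive bags, n- = 1 negative bag; bags may differ in size.
positive_bags = [make_bag(3), make_bag(5)]
negative_bags = [make_bag(4)]

# B = {B_1^+, ..., B_{n+}^+, B_1^-, ..., B_{n-}^-}
bags = positive_bags + negative_bags
bag_labels = [+1] * len(positive_bags) + [-1] * len(negative_bags)

# All instances share the same feature dimensionality.
assert all(bag.shape[1] == 4 for bag in bags)
```

Note that only the bag labels are given at training time; instance labels inside the positive bags are unknown, which is exactly the ambiguity addressed next.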
3.2 Disambiguation

According to the MIL formulation, all instances in the negative bags are negative, and hence there is no ambiguity in the negative bags if there is no labeling noise. As for the positive bags, the only thing we know is that each positive bag must contain at least one true positive instance, but it may also contain many negative instances. Thus, ambiguity arises in the positive bags, since we do not know the labels of the instances there. The goal of disambiguation is to identify the true positive instances in the positive bags.

March 1, 2009 DRAFT
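The ambiguity described above follows directly from the standard MIL assumption: a bag is positive if and only if it contains at least one positive instance. A minimal sketch of that rule (illustrative function name; instance labels are assumed known here only to demonstrate the assumption, whereas in training they are hidden):

```python
def bag_label_from_instances(instance_labels):
    """Standard MIL assumption: a bag is positive (+1) if it contains
    at least one positive instance, otherwise negative (-1)."""
    return +1 if any(l == +1 for l in instance_labels) else -1

# A positive bag may mix true positives with negatives, so the bag
# label alone does not reveal which instances are positive:
assert bag_label_from_instances([-1, +1, -1]) == +1
# A negative bag contains only negative instances, so no ambiguity:
assert bag_label_from_instances([-1, -1]) == -1
```

Disambiguation, then, is the inverse problem: given only the bag labels produced by this rule, recover the instance labels inside the positive bags.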