Preface

an unknown data sample into a definite class, it is classified into that class. In this formulation, if more than one decision function classifies a data sample into definite classes, or if no decision function classifies the data sample into a definite class, the data sample is unclassifiable. Another problem of support vector machines is slow training.
Because support vector machines are trained by solving a quadratic programming problem with the number of variables equal to the number of training data, training is slow for a large number of training data.

To resolve unclassifiable regions for multiclass support vector machines, we propose fuzzy support vector machines and decision-tree-based support vector machines. To accelerate training, in this book we discuss two approaches: selection of the important data before training support vector machines, and training by decomposing the optimization problem into two subproblems. To improve the generalization ability of non-SVM-type classifiers, we introduce the ideas of support vector machines into those classifiers: neural network training that incorporates margin maximization, and a kernel version of a fuzzy classifier with ellipsoidal regions [6, pp. 90–93, 119–139].

In Chapter 1, we discuss two types of decision functions: direct decision functions, in which the class boundary is given by the curve where the decision function vanishes, and indirect decision functions, in which the class boundary is given by the curve where two decision functions take on the same value.

In Chapter 2, we discuss the architecture of support vector machines for two-class classification problems. First we explain hard-margin support vector machines, which are used when the classification problem is linearly separable, namely, when the training data of the two classes are separated by a single hyperplane. Then, introducing slack variables for the training data, we extend hard-margin support vector machines so that they are applicable to inseparable problems. There are two types of support vector machines: L1 soft-margin support vector machines and L2 soft-margin support vector machines. Here, L1 and L2 denote the linear sum and the square sum, respectively, of the slack variables that are added to the objective function for training. Then we investigate the characteristics of solutions extensively and survey several techniques for estimating the generalization ability of support vector machines.

In Chapter 3, we discuss some methods for multiclass problems: one-against-all support vector machines, in which each class is separated from the remaining classes; pairwise support vector machines, in which one class is separated from another class; the use of error-correcting output codes for resolving unclassifiable regions; and all-at-once support vector machines, in which decision functions for all the classes are determined at once. To resolve unclassifiable regions, in addition to error-correcting codes, we discuss fuzzy support vector machines with membership functions and decision-tree-based support vector machines. To compare several methods for multiclass prob-
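To make the Chapter 1 distinction concrete in symbols (a sketch using generic decision functions D, D_i, not notation fixed by this preface): a direct decision function places the class boundary on the zero set of a single function, whereas indirect decision functions place it where two class-wise functions agree:

\[
\{\,\mathbf{x} : D(\mathbf{x}) = 0\,\} \quad \text{(direct)}, \qquad
\{\,\mathbf{x} : D_i(\mathbf{x}) = D_j(\mathbf{x})\,\} \quad \text{(indirect)}.
\]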
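A worked sketch of the Chapter 2 objectives may also help. Assuming M training pairs (x_i, y_i) with y_i in {+1, -1}, slack variables xi_i, and a margin parameter C (these symbols are the common conventions, not taken from this page, and the problems are written for the linear case), the two soft-margin primal problems differ only in how the slacks enter the objective:

\[
\text{L1:}\quad \min_{\mathbf{w},\,b,\,\boldsymbol{\xi}}\; \frac{1}{2}\|\mathbf{w}\|^{2} + C\sum_{i=1}^{M}\xi_{i}
\quad \text{s.t.}\quad y_{i}\bigl(\mathbf{w}^{\top}\mathbf{x}_{i} + b\bigr) \ge 1 - \xi_{i},\;\; \xi_{i} \ge 0,
\]
\[
\text{L2:}\quad \min_{\mathbf{w},\,b,\,\boldsymbol{\xi}}\; \frac{1}{2}\|\mathbf{w}\|^{2} + \frac{C}{2}\sum_{i=1}^{M}\xi_{i}^{2}
\quad \text{s.t.}\quad y_{i}\bigl(\mathbf{w}^{\top}\mathbf{x}_{i} + b\bigr) \ge 1 - \xi_{i},
\]

where the nonnegativity constraints on the slacks can be dropped in the L2 case, because they hold automatically at the optimum.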
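The unclassifiable regions of the one-against-all scheme in Chapter 3 are easy to see in a small sketch (a hypothetical helper, not code from the book): a sample receives a definite class only when exactly one of the n decision functions is positive.

import numpy as np

def one_against_all_label(decision_values):
    # decision_values[i] is the value of the i-th one-against-all
    # decision function (class i versus the rest); a positive value
    # means the i-th classifier claims the sample for class i.
    positive = np.flatnonzero(np.asarray(decision_values) > 0)
    if len(positive) == 1:
        return int(positive[0])  # exactly one claim: a definite class
    return None                  # zero or several claims: unclassifiable

# Three-class example: two decision functions positive -> unclassifiable.
print(one_against_all_label([0.3, 1.2, -0.5]))   # None
print(one_against_all_label([-0.2, 0.8, -0.5]))  # 1

The fuzzy support vector machines discussed in the book resolve such cases through membership functions; a simpler continuous rule often used in practice is to assign the class with the maximum decision value, arg max_i D_i(x).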