2.8.4 Effect of Model Selection by Cross-Validation: In realizing high generalization ability of a support vector machine, selection of kernels and their parameter values, i.e., model selection, is very important. Here I discuss how cross-validation, which is one of the most widely used model selection methods, works to generate a support vector machine with high generalization ability (a brief illustrative sketch follows this list).

3.4 I have deleted the section "Sophisticated Architecture" because it does not work.

4.3 Sparse Support Vector Machines: Based on the idea of the empirical feature space, sparse support vector machines, which realize smaller numbers of support vectors than support vector machines do, are discussed.

4.4 Performance Comparison of Different Classifiers: Performance of some types of support vector machines is compared using benchmark data sets.

4.8 Learning Using Privileged Information: Incorporating prior knowledge into support vector machines is very useful in improving the generalization ability. Here, one such approach proposed by Vapnik is explained.
4.9 Semi-supervised Learning: I have explained the difference between semi-supervised learning and transductive learning.

4.10 Multiple Classifier Systems: Committee machines in the first edition are renamed, and new materials are added.

4.11 Multiple Kernel Learning: A weighted sum of kernels with positive weights is also a kernel and is called a multiple kernel (see the worked equation after this list). A learning method of support vector machines with multiple kernels is discussed.

5.6 Steepest Ascent Methods and Newton's Methods: Steepest ascent methods in the first edition are renamed Newton's methods, and steepest ascent methods are explained in Section 5.6.1.

5.7 Batch Training by Exact Incremental Training: A batch training method based on incremental training is added.

5.8 Active Set Training in Primal and Dual: Training methods in the primal or dual form by variable-size chunking are added.

5.9 Training of Linear Programming Support Vector Machines: Three decomposition techniques for linear programming support vector machines are discussed.

6 Kernel-Based Methods: Chapter 8, Kernel-Based Methods, in the first edition is placed just after Chapter 5, Training Methods, and kernel discriminant analysis is added.

11.5.3 Active Set Training: Active set training discussed in Section 5.8 is extended to function approximation.

11.7 Variable Selection: Variable selection for support vector regressors is added.
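To make concrete how cross-validation drives the model selection of Section 2.8.4, the following is a minimal Python sketch. It assumes scikit-learn is available; the kernel candidates and parameter grids are illustrative assumptions, not values prescribed in the book.

    # Minimal sketch: choosing a kernel and its parameters by k-fold
    # cross-validation. Grid values below are illustrative assumptions.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    param_grid = [
        {"kernel": ["linear"], "C": [0.1, 1, 10, 100]},
        {"kernel": ["rbf"], "C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]},
    ]
    # Each candidate model is scored by 5-fold cross-validated accuracy;
    # the best-scoring model is then refit on the full training set.
    search = GridSearchCV(SVC(), param_grid, cv=5)
    search.fit(X, y)
    print(search.best_params_, search.best_score_)

The kernel and parameter values selected this way are those that generalize best across the held-out folds, rather than those that merely fit the training data.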
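The claim in Section 4.11 can be stated as a short worked equation. In LaTeX notation (the symbols M, \beta_m, and K_m are illustrative and need not match the book's own notation):

\[
  K(\mathbf{x}, \mathbf{x}') \;=\; \sum_{m=1}^{M} \beta_m \, K_m(\mathbf{x}, \mathbf{x}'),
  \qquad \beta_m > 0,
\]

is again a kernel: each K_m induces a positive semidefinite kernel matrix, and a sum of positive semidefinite matrices scaled by positive weights remains positive semidefinite. Multiple kernel learning then amounts to learning the weights \beta_m together with the support vector machine.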