Contents

4 Variants of Support Vector Machines ............................ 163
    4.1 Least-Squares Support Vector Machines .................... 163
        4.1.1 Two-Class Least-Squares Support Vector Machines ... 164
        4.1.2 One-Against-All Least-Squares Support Vector
              Machines ........................................... 166
        4.1.3 Pairwise Least-Squares Support Vector Machines ..... 168
        4.1.4 All-at-Once Least-Squares Support Vector Machines .. 169
        4.1.5 Performance Comparison ............................. 170
    4.2 Linear Programming Support Vector Machines ............... 174
        4.2.1 Architecture ....................................... 175
        4.2.2 Performance Evaluation ............................. 178
    4.3 Sparse Support Vector Machines ........................... 180
        4.3.1 Several Approaches for Sparse Support Vector
              Machines ........................................... 181
        4.3.2 Idea ............................................... 183
        4.3.3 Support Vector Machines Trained in the Empirical
              Feature Space ...................................... 184
        4.3.4 Selection of Linearly Independent Data ............. 187
        4.3.5 Performance Evaluation ............................. 189
    4.4 Performance Comparison of Different Classifiers .......... 192
    4.5 Robust Support Vector Machines ........................... 196
    4.6 Bayesian Support Vector Machines ......................... 197
        4.6.1 One-Dimensional Bayesian Decision Functions ........ 199
        4.6.2 Parallel Displacement of a Hyperplane .............. 200
        4.6.3 Normal Test ........................................ 201
    4.7 Incremental Training ..................................... 201
        4.7.1 Overview ........................................... 201
        4.7.2 Incremental Training Using Hyperspheres ............ 204
    4.8 Learning Using Privileged Information .................... 213
    4.9 Semi-Supervised Learning ................................. 216
    4.10 Multiple Classifier Systems ............................. 217
    4.11 Multiple Kernel Learning ................................ 218
    4.12 Confidence Level ........................................ 219
    4.13 Visualization ........................................... 220
    References ................................................... 220

5 Training Methods ............................................... 227
    5.1 Preselecting Support Vector Candidates ................... 227
        5.1.1 Approximation of Boundary Data ..................... 228
        5.1.2 Performance Evaluation ............................. 230
    5.2 Decomposition Techniques ................................. 231
    5.3 KKT Conditions Revisited ................................. 234
    5.4 Overview of Training Methods ............................. 239
    5.5 Primal–Dual Interior-Point Methods ....................... 242