Contents

    5.5.1 Primal-Dual Interior-Point Methods for Linear Programming  242
    5.5.2 Primal-Dual Interior-Point Methods for Quadratic Programming  246
    5.5.3 Performance Evaluation  248
  5.6 Steepest Ascent Methods and Newton's Methods  252
    5.6.1 Solving Quadratic Programming Problems Without Constraints  252
    5.6.2 Training of L1 Soft-Margin Support Vector Machines  254
    5.6.3 Sequential Minimal Optimization  259
    5.6.4 Training of L2 Soft-Margin Support Vector Machines  260
    5.6.5 Performance Evaluation  261
  5.7 Batch Training by Exact Incremental Training  262
    5.7.1 KKT Conditions  263
    5.7.2 Training by Solving a Set of Linear Equations  264
    5.7.3 Performance Evaluation  272
  5.8 Active Set Training in Primal and Dual  273
    5.8.1 Training Support Vector Machines in the Primal  273
    5.8.2 Comparison of Training Support Vector Machines in the Primal and the Dual  276
    5.8.3 Performance Evaluation  279
  5.9 Training of Linear Programming Support Vector Machines  281
    5.9.1 Decomposition Techniques  282
    5.9.2 Decomposition Techniques for Linear Programming Support Vector Machines  289
    5.9.3 Computer Experiments  297
  References  299

6 Kernel-Based Methods  305
  6.1 Kernel Least Squares  305
    6.1.1 Algorithm  305
    6.1.2 Performance Evaluation  308
  6.2 Kernel Principal Component Analysis  311
  6.3 Kernel Mahalanobis Distance  314
    6.3.1 SVD-Based Kernel Mahalanobis Distance  315
    6.3.2 KPCA-Based Mahalanobis Distance  318
  6.4 Principal Component Analysis in the Empirical Feature Space  319
  6.5 Kernel Discriminant Analysis  320
    6.5.1 Kernel Discriminant Analysis for Two-Class Problems  321
    6.5.2 Linear Discriminant Analysis for Two-Class Problems in the Empirical Feature Space  324
    6.5.3 Kernel Discriminant Analysis for Multiclass Problems  325
  References  327