14                                Support Vector Machines in R

[Figure 2 plot: SVM classification plot of the iris data, Petal.Width against Petal.Length; see caption.]

Figure 2: SVM plot visualizing the iris data. Support vectors are shown as 'X', true classes are highlighted through symbol color, predicted class regions are visualized using colored background.

4 -1.364424
5 -1.423417
6 -1.158232

Probability values can be obtained in a similar way.

In the next example, we again train a classification model on the spam data. This time, however, we will tune the hyper-parameters on a subsample using the tune framework of e1071:

> tobj <- tune.svm(type ~ ., data = spam_train[1:300,
+     ], gamma = 10^(-6:-3), cost = 10^(1:2))
> summary(tobj)

Parameter tuning of 'svm':

- sampling method: 10-fold cross validation

- best parameters:
  gamma cost
  0.001   10
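As a hedged illustration of the preceding remark that probability values can be obtained in a similar way: with e1071, the model must be fitted with probability = TRUE, and the class probabilities are then retrieved from the "probabilities" attribute of the prediction object. The sketch below uses the built-in iris data rather than the paper's spam data, purely for self-containment.

```r
## Illustrative sketch (not from the paper): class probabilities in e1071.
## Assumes the e1071 package is installed; uses the built-in iris data.
library(e1071)
data(iris)

## Fit with probability estimates enabled (Platt scaling internally).
model <- svm(Species ~ ., data = iris, probability = TRUE)

## Request probabilities at prediction time as well.
pred <- predict(model, iris[1:3, -5], probability = TRUE)

## One row per observation, one column per class; rows sum to 1.
attr(pred, "probabilities")
```

The same pattern applies to the spam model in the text: refit with probability = TRUE, then pass probability = TRUE to predict().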