Holdout & Cross-validation method
◼ Holdout method
  ◆ Given data is randomly partitioned into two independent sets
    - Training set (e.g., 2/3) for model construction
    - Test set (e.g., 1/3) for accuracy estimation
  ◆ Random sampling: a variation of holdout
    - Repeat holdout k times; accuracy = avg. of the accuracies obtained
◼ Cross-validation (k-fold, where k = 10 is most popular); see the sketch below
  ◆ Randomly partition the data into k mutually exclusive subsets D_1, ..., D_k, each of approximately equal size
  ◆ At the i-th iteration, use D_i as the test set and the remaining subsets as the training set
  ◆ Leave-one-out: k folds where k = # of tuples; used for small-sized data
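A minimal sketch of the three evaluation schemes above, using scikit-learn; the iris data set, the DecisionTreeClassifier, and the choice of 5 repetitions for random sampling are illustrative assumptions, not part of the slide.

  import numpy as np
  from sklearn.datasets import load_iris
  from sklearn.model_selection import train_test_split, cross_val_score, KFold, LeaveOneOut
  from sklearn.tree import DecisionTreeClassifier

  X, y = load_iris(return_X_y=True)          # example data (assumption)
  clf = DecisionTreeClassifier(random_state=0)

  # Holdout: 2/3 for training, 1/3 for accuracy estimation
  X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1/3, random_state=0)
  holdout_acc = clf.fit(X_tr, y_tr).score(X_te, y_te)

  # Random sampling: repeat the holdout split several times, average the accuracies
  accs = []
  for seed in range(5):                      # 5 repetitions chosen for illustration
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1/3, random_state=seed)
      accs.append(clf.fit(X_tr, y_tr).score(X_te, y_te))
  random_sampling_acc = np.mean(accs)

  # k-fold cross-validation (k = 10): each subset D_i serves as the test set once
  kfold_acc = cross_val_score(clf, X, y, cv=KFold(n_splits=10, shuffle=True, random_state=0)).mean()

  # Leave-one-out: k = number of tuples; only practical for small data sets
  loo_acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()

  print(holdout_acc, random_sampling_acc, kfold_acc, loo_acc)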