6002 IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 27, NO. 12, DECEMBER 2018

TABLE IV
EXAMPLE POINTS OF THE DATASETS
(The example images are not recoverable from this copy; the surviving labels per dataset are:)
CIFAR-10: "frog", "deer", "truck"
SVHN: "2", "3", "9"
NUS-WIDE: "person", "sky"; "clouds", "ocean", "person", "sky", "water"; "road", "clouds", "sky", "buildings"
Clothing1M: "T-shirt", "shawl"

• Latent factor hashing (LFH) [33]: LFH is a supervised hashing method which tries to learn binary codes based on latent factor models.
• Fast supervised hashing (FastH) [23]: FastH is a supervised hashing method. FastH directly adopts a graph-cut method to learn discrete binary codes.
• Supervised discrete hashing (SDH) [25]: SDH is a pointwise supervised hashing method which utilizes the discrete cyclic coordinate descent (DCC) algorithm to learn discrete hash codes.
• Column sampling based discrete supervised hashing (COSDISH) [34]: COSDISH is a supervised hashing method. COSDISH can directly learn discrete hash codes.
• Nonlinear deep hashing (NDH) [17]: NDH is a deep supervised hashing method. NDH utilizes both pointwise labels and pairwise similarity to guide binary coding and hash-function learning. However, NDH is a hand-crafted-feature-based method.
• Deep hashing network (DHN) [43]: DHN is a deep supervised hashing method. DHN minimizes both a pairwise cross-entropy loss and a pairwise quantization loss.
• Deep supervised hashing (DSH) [40]: DSH is a deep supervised hashing method. DSH takes pairs of points as input and learns binary codes by maximizing the discriminability of the corresponding binary codes.
• Deep pairwise-supervised hashing (DPSH) [42]: DPSH is a deep supervised hashing method. DPSH performs deep feature learning and hash-code learning simultaneously with pairwise labels by minimizing the negative log-likelihood of the observed pairwise labels.
• Deep supervised discrete hashing (DSDH) [44]: DSDH is a deep supervised hashing method. Similar to NDH, DSDH also utilizes both pointwise labels and pairwise similarity to learn binary codes and a deep neural network. Furthermore, DSDH adopts the method of auxiliary coordinates (MAC) technique from AFFHash [49] to bridge the binary coding procedure and the feature learning procedure.

Among all these baselines, LSH is a data-independent hashing method. ITQ is an unsupervised hashing method. LFH, FastH, COSDISH, and SDH are non-deep methods, which cannot perform deep feature learning. LFH is a relaxation-based method. FastH, COSDISH and SDH are discrete supervised hashing methods. NDH is a hand-crafted-feature-based deep supervised hashing method. DHN, DSH, and DPSH are deep hashing methods with pairwise labels which can perform feature learning and hash-code learning simultaneously. DSDH is a deep supervised hashing method which utilizes both pointwise labels and pairwise similarity.

For fair comparison, all feature-learning-based deep hashing methods, including the deep baselines and our DDSH, adopt the same CNN-F model pre-trained on ImageNet6 for feature learning. Because the CNN-F model is pre-trained with images of size 224 × 224 pixels, we first resize all images to 224 × 224 pixels for the four datasets.
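The pairwise negative log-likelihood that DPSH minimizes (and that DHN's pairwise cross-entropy loss closely resembles) can be sketched as below. This is our own NumPy illustration, not code from either paper: the function name is hypothetical, and following common practice the binary codes are relaxed to real-valued network outputs U, with θij = u_i·u_j / 2.

```python
import numpy as np

def pairwise_nll(U, S):
    """Pairwise negative log-likelihood (DPSH-style sketch).

    U: (n, c) real-valued relaxed codes from the network.
    S: (n, n) pairwise similarity labels, S[i, j] = 1 if similar, else 0.
    For each pair, the likelihood of s_ij is sigmoid(theta_ij)^s_ij *
    (1 - sigmoid(theta_ij))^(1 - s_ij), so the NLL sums
    log(1 + exp(theta_ij)) - s_ij * theta_ij over all pairs.
    """
    theta = 0.5 * U @ U.T
    # np.logaddexp(0, x) is a numerically stable log(1 + exp(x))
    return float(np.sum(np.logaddexp(0.0, theta) - S * theta))
```

As a sanity check, codes that agree with the similarity structure (similar pairs aligned, dissimilar pairs anti-aligned) should score a lower loss than codes that contradict it.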
Then the raw image pixels are directly utilized as input for the deep hashing methods. We carefully implement DHN and DSH on MatConvNet. We fix the mini-batch size to 128 and tune the learning rate from 10−6 to 10−2 using a cross-validation strategy. Furthermore, we set the weight decay to 5 × 10−4 to avoid overfitting. For DDSH, we set the sampled-set size to 100, Tout = 3 and Tin = 50 for the CIFAR-10, SVHN and NUS-WIDE datasets. Because NUS-WIDE is a multi-label dataset, we reduce the similarity weight for those training points with multiple labels when we train DDSH. For the Clothing1M dataset, we set the sampled-set size to 500, Tout = 10 and Tin = 15.

For the other non-deep hashing methods, including LSH, ITQ, LFH, FastH, SDH, COSDISH and NDH, we use 4,096-dimensional deep features extracted by the CNN-F model pre-trained on ImageNet as input for fair comparison. Because SDH is a kernel-based method, we randomly sample 1,000 data points as anchors to construct the kernel, following the suggestion of Shen et al. [25]. For LFH, FastH and COSDISH, we utilize boosted decision trees for out-of-sample extension, following the setting of FastH. For NDH, whose source code is not available, we carefully re-implement its algorithm ourselves. Following the authors' suggestion, we use 200-dimensional feature vectors derived from PCA on the deep features for NDH.

6We download the CNN-F model pre-trained on ImageNet from http://www.vlfeat.org/matconvnet/pretrained/.
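The anchor-based kernel construction used when running SDH on the pre-extracted CNN-F features can be sketched as below. The text only states that 1,000 sampled points serve as anchors, so the RBF kernel form, the mean-distance bandwidth heuristic, and all names here are our assumptions rather than the authors' released setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_anchor_features(X, n_anchors=1000, sigma=None):
    """Map feature vectors to RBF kernel features against sampled anchors.

    X: (n, d) feature matrix (e.g. 4,096-dim CNN-F features).
    Returns an (n, n_anchors) matrix K with
    K[i, j] = exp(-||x_i - a_j||^2 / (2 * sigma^2)).
    """
    idx = rng.choice(len(X), size=min(n_anchors, len(X)), replace=False)
    anchors = X[idx]
    # squared Euclidean distance from every point to every anchor
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(axis=2)
    if sigma is None:
        # heuristic bandwidth: root-mean-square point-to-anchor distance
        sigma = np.sqrt(d2.mean())
    return np.exp(-d2 / (2.0 * sigma ** 2))
```

Each resulting row lies in (0, 1]^n_anchors and replaces the raw feature vector as SDH's input, which is the standard way a kernel-based method is fed a fixed anchor set.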