as 90% accuracy once training exceeds 800 learning epochs, while the training accuracy reaches 98% after 2000 learning epochs. The result indicates that our CNN model converges quickly to about 90% accuracy within a reasonable number of epochs and a reasonable amount of time.

VII. RELATED WORK

There have been active research efforts in gesture recognition, which can be broadly divided into two main categories:

Device-based Approaches.
Previous research has shown that both the built-in motion sensors on wearable devices and wearable RFID tags attached to the human body can be utilized for gesture recognition [6, 14, 17]. For example, ArmTrack [15] tracks the entire arm relying solely on a smartwatch, and FitCoach [6] assesses dynamic postures in workouts by recognizing exercise gestures from wearable sensors. However, these methods suffer from short life cycles due to the high energy cost of sensing and computation. RF-IDraw [17] and Pantomime [14] track the motion pattern of RFID tags for gesture recognition. These approaches, however, require the tags to be attached to the finger or to a passive object held by the user. Attaching RFID tags to the human body degrades the user experience, especially for manipulation in VR applications. Different from previous studies, we propose a device-free approach with an RFID tag array, so the user can perform each gesture naturally without wearing any specialized device.

Device-free Approaches. As an emerging solution for gesture recognition, device-free approaches have gained significant attention in recent years. Camera-based approaches, e.g., Microsoft Kinect and LeapMotion, are a mature technique that can extract the body or finger structure using computer vision techniques. However, reconstructing the body or finger structure from video streams usually incurs high computation cost and unexpected privacy leakage. Recently, several studies try to recognize gestures by leveraging other signals, e.g., WiFi [16], acoustic signals [18], and visible light [9]. However, these solutions are either easily affected by ambient noise or incapable of sensing fine-grained gestures. Yang et al. propose to locate the human body using COTS RFID via a device-free approach [21], which shows the potential of device-free sensing in RFID systems. More recently, RF-IPad [4], another device-free approach based on RFID, recognizes human writing by detecting strokes. In contrast, we focus on tracking the finger trace, which is a finger-level, fine-grained tracking problem. Moreover, we recognize multi-touch gestures with a device-free approach based on RFID, which remains an open problem.

VIII. CONCLUSION

In this paper, we propose RF-finger, a device-free system to track finger writing and recognize multi-touch gestures based on a COTS RFID system. RF-finger provides a practical solution to precisely track the fine-grained finger trace and recognize multi-touch gestures, which facilitates in-the-air operations in many smart applications (e.g., VR/AR and IoT systems). Our key innovations lie in modeling the reflection of the finger on the tag array and extracting the reflection features of the finger based on the model. Using the reflection features, we leverage the KNN method to track the finger trace and a CNN model to recognize multi-touch gestures. The experimental results confirm the effectiveness of RF-finger on both finger writing tracking and multi-touch gesture recognition, achieving over 88% and 92% accuracy, respectively.
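To make this pipeline concrete, the following is a minimal sketch (in Python, assuming scikit-learn and PyTorch) of how a KNN-based trace tracker and a small CNN gesture classifier could be wired together over per-frame reflection features. The grid size, layer shapes, and helper names (e.g., fit_tracker, track, GestureCNN) are our own illustrative assumptions, not the RF-finger implementation.

```python
# Illustrative sketch only: the names, grid size, and model shapes below are
# assumptions for exposition, not the authors' released implementation.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
import torch
import torch.nn as nn

GRID = (8, 8)  # assumed spatial resolution of the reflection-feature map

# --- Finger-trace tracking: KNN over reflection features -------------------
# Training pairs: flattened reflection-feature frames labelled with known
# finger positions; KNN then maps a new frame to an (x, y) estimate.
knn = KNeighborsRegressor(n_neighbors=5)

def fit_tracker(features, positions):
    """features: (N, GRID[0]*GRID[1]) array, positions: (N, 2) array."""
    knn.fit(features, positions)

def track(feature_frame):
    """Estimate one finger position from a single reflection-feature frame."""
    return knn.predict(feature_frame.reshape(1, -1))[0]

# --- Multi-touch gesture recognition: small CNN over feature maps ----------
class GestureCNN(nn.Module):
    def __init__(self, num_gestures):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * (GRID[0] // 4) * (GRID[1] // 4), num_gestures),
        )

    def forward(self, x):  # x: (batch, 1, GRID[0], GRID[1])
        return self.net(x)

# Usage sketch with random stand-in data, only to show the expected shapes.
feats = np.random.rand(100, GRID[0] * GRID[1])
poses = np.random.rand(100, 2)
fit_tracker(feats, poses)
print(track(feats[0]))              # estimated (x, y) for one frame

model = GestureCNN(num_gestures=6)
logits = model(torch.randn(4, 1, *GRID))
print(logits.shape)                 # torch.Size([4, 6])
```

In practice, the feature frames would come from the reflection model described above, with positions labelled during calibration and gesture classes labelled per recorded session.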
ACKNOWLEDGMENT

This work is partially supported by the National Natural Science Foundation of China under Grant Nos. 61472185, 61373129, 61321491, and 61502224; the Jiangsu Natural Science Foundation, No. BK20151390; the Collaborative Innovation Center of Novel Software Technology and Industrialization; the Program A for Outstanding PhD Candidate of Nanjing University; the US National Science Foundation Grants CNS-1514436, CNS-1716500, and CNS-1717356; and Army Research Office Grant W911NF-17-1-0467.

REFERENCES

[1] Amplitude. https://en.wikipedia.org/wiki/Amplitude.
[2] Gesture recognition market. http://www.transparencymarketresearch.com/gesture-recognition-market.html.
[3] LipiTk. http://lipitk.sourceforge.net/.
[4] H. Ding, C. Qian, J. Han, G. Wang, W. Xi, K. Zhao, and J. Zhao. RF-IPad: Enabling cost-efficient and device-free in-air handwriting using passive tags. In Proc. of IEEE ICDCS, 2017.
[5] D. M. Dobkin. The RF in RFID: Passive UHF RFID in Practice. Newnes, 2007.
[6] X. Guo, J. Liu, and Y. Chen. FitCoach: Virtual fitness coach empowered by wearable mobile devices. In Proc. of IEEE INFOCOM, 2017.
[7] J. Han, H. Ding, C. Qian, W. Xi, Z. Wang, Z. Jiang, L. Shangguan, and J. Zhao. A customer behavior identification system using passive tags. IEEE/ACM Transactions on Networking, 2016.
[8] J. Han, C. Qian, X. Wang, D. Ma, J. Zhao, W. Xi, Z. Jiang, and Z. Wang. Twins: Device-free object tracking using passive tags. IEEE/ACM Transactions on Networking, 2016.
[9] T. Li, C. An, Z. Tian, A. T. Campbell, and X. Zhou. Human sensing using visible light communication. In Proc. of ACM MobiCom, 2015.
[10] J. Liu, M. Chen, S. Chen, Q. Pan, and L. Chen. Tag-Compass: Determining the spatial direction of an object with small dimensions. In Proc. of IEEE INFOCOM, 2017.
[11] J. Liu, F. Zhu, Y. Wang, X. Wang, Q. Pan, and L. Chen. RF-Scanner: Shelf scanning with robot-assisted RFID systems. In Proc. of IEEE INFOCOM, 2017.
[12] X. Liu, X. Xie, K. Li, B. Xiao, J. Wu, H. Qi, and D. Lu. Fast tracking the population of key tags in large-scale anonymous RFID systems. IEEE/ACM Transactions on Networking, 2017.
[13] K. Pearson. Notes on regression and inheritance in the case of two parents. In Proc. of the Royal Society of London, 1895.
[14] L. Shangguan, Z. Zhou, and K. Jamieson. Enabling gesture-based interactions with objects. In Proc. of ACM MobiSys, 2017.
[15] S. Shen, H. Wang, and R. R. Choudhury. I am a smartwatch and I can track my user's arm. In Proc. of ACM MobiSys, 2016.
[16] S. Tan and J. Yang. WiFinger: Leveraging commodity WiFi for fine-grained finger gesture recognition. In Proc. of ACM MobiHoc, 2016.
[17] J. Wang, D. Vasisht, and D. Katabi. RF-IDraw: Virtual touch screen in the air using RF signals. In Proc. of ACM SIGCOMM, 2015.
[18] W. Wang, A. X. Liu, and K. Sun. Device-free gesture tracking using acoustic signals. In Proc. of ACM MobiCom, 2016.
[19] R. K. Wangsness. Electromagnetic Fields. New York, NY, USA: Wiley-VCH, 1986.
[20] L. Yang, Y. Chen, X.-Y. Li, C. Xiao, M. Li, and Y. Liu. Tagoram: Real-time tracking of mobile RFID tags to high precision using COTS devices. In Proc. of ACM MobiCom, 2014.
[21] L. Yang, Q. Lin, X. Li, T. Liu, and Y. Liu. See through walls with COTS RFID system! In Proc. of ACM MobiCom, 2015.