Fig. 1. QGesture system overview.

accuracy of 96% [1]. However, most of these systems only recognized a predefined set of gestures without considering movement distance/direction measurements. WiDir used WiFi CSI to estimate the whole-body movement direction, such as walking, with an error of 10 degrees [33]. For small hand movements, WiDraw used the Angle-of-Arrival (AOA) measurement to achieve a tracking accuracy of 5 cm [25]. However, the AOA-based approach also had a limited working range of less than 2 feet, so it cannot be used as a remote control in HCI applications. QGesture is inspired by previous WiFi CSI processing technologies, including the noise removal algorithm, basic phase correction algorithm, and preamble gesture design. QGesture advances the state-of-the-art design by capturing the small phase variations caused by hand movements at a long distance. In addition to WiFi-based schemes, existing schemes also use COTS RFID readers and tags to track gestures [9, 28]. However, these systems require users to wear RFID tags or operate close to the RFID array, which makes them inconvenient to use.
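QGesture's ability to sense hand motion at a distance relies on the fact that the phase of the hand-reflected WiFi path rotates as the path length changes. The snippet below is a minimal, hedged sketch of that relationship only, not QGesture's algorithm; the 5.32 GHz carrier, the factor-of-two path-length assumption, and the function name are illustrative.

```python
import math

# Minimal sketch (not the paper's implementation) of why small CSI phase
# changes reveal hand movement: the phase of the hand-reflected path rotates
# by 2*pi for every wavelength of extra path length. Assumes a single
# dominant reflection and that moving the hand by d meters changes the
# reflection path length by roughly 2*d; the 5.32 GHz carrier is an
# assumed value, not taken from the paper.

C = 3e8                        # speed of light, m/s
CARRIER_FREQ = 5.32e9          # assumed 5 GHz WiFi channel
WAVELENGTH = C / CARRIER_FREQ  # ~5.6 cm

def hand_distance_from_phase(delta_phase_rad: float) -> float:
    """Approximate hand movement (m) from an unwrapped phase change (rad)
    of the reflected path, under the 2*d path-length assumption."""
    path_length_change = delta_phase_rad * WAVELENGTH / (2 * math.pi)
    return path_length_change / 2.0

# A full 2*pi phase rotation corresponds to roughly 2.8 cm of hand movement,
# which is why centimeter-scale motion is visible in the CSI phase.
print(hand_distance_from_phase(2 * math.pi))  # ~0.028 m
```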
RF-based Recognition/Tracking Using Specialized Devices: RF signals can also be captured by specialized devices such as software radio systems. Software radio systems, such as USRP or WARP, have access to the fine-grained baseband signal, so they can quantify hand/finger movement distance and speed [2, 3, 14, 31, 35]. WiSee used a USRP software radio to identify and classify nine whole-body gestures with an accuracy of 94% [24]. WiTrack used a specially designed Frequency-Modulated Continuous-Wave (FMCW) radar with a high bandwidth of 1.79 GHz to track human movements behind the wall with a resolution of about 11 cm to 20 cm [2, 3]. WiDeo used the WARP hardware to achieve a tracking accuracy of 7 cm for multiple objects [14]. AllSee used a specially designed analog circuit to extract the envelopes of the received signals and recognize gestures within a short distance of 2.5 feet [15]. While these systems provided valuable insights into the dynamics of the wireless signal, tracking with coarse-grained CSI measurements requires a different set of signal processing algorithms.
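As a point of reference for the bandwidth figure above, the resolution quoted for WiTrack is consistent with the standard FMCW range-resolution bound c/(2B); the short check below only applies that textbook formula to the quoted 1.79 GHz bandwidth and is not a result from the cited papers.

```python
# Sanity check (textbook formula, not a figure from the cited papers):
# the theoretical FMCW range resolution is c / (2 * B).
C = 3e8              # speed of light, m/s
BANDWIDTH = 1.79e9   # sweep bandwidth quoted above for WiTrack
range_resolution = C / (2 * BANDWIDTH)
print(f"{range_resolution * 100:.1f} cm")  # ~8.4 cm
# The reported 11-20 cm accuracy sits above this bound, as expected
# for a real deployment with noise and multipath.
```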
Non-RF-based Recognition/Tracking: Gesture recognition can also be enabled by non-RF-based technologies, including computer vision, wearable devices, and sound waves. Computer vision based gesture recognition uses cameras and infrared sensors to reconstruct depth information from videos. The distance measurement accuracy of computer vision based solutions can be a few millimeters when the target is within one meter [32], but the depth accuracy degrades to a few centimeters at an operational range of 5 meters [16]. The key limitation of computer vision based solutions is that their accuracy is highly dependent on the viewing angle and lighting conditions. Moreover, users may also have privacy concerns about video-camera-based solutions. Sound waves can be used to measure moving distances [23, 38, 39] or moving speeds [11]. When the user is holding the