Device-Free Gesture Tracking Using Acoustic Signals

Wei Wang†, Alex X. Liu†‡, Ke Sun†
†State Key Laboratory for Novel Software Technology, Nanjing University, China
‡Dept. of Computer Science and Engineering, Michigan State University, USA
ww@nju.edu.cn, alexliu@cse.msu.edu, samsonsunke@gmail.com

ABSTRACT

Device-free gesture tracking is an enabling HCI mechanism for small wearable devices because fingers are too big to control the GUI elements on such small screens, and it is also an important HCI mechanism for medium-to-large size mobile devices because it allows users to provide input without blocking screen view. In this paper, we propose LLAP, a device-free gesture tracking scheme that can be deployed on existing mobile devices as software, without any hardware modification. We use speakers and microphones that already exist on most mobile devices to perform device-free tracking of a hand/finger.
The key idea is to use acoustic phase to get fine-grained movement direction and movement distance measurements. LLAP first extracts the sound signal reflected by the moving hand/finger after removing the background sound signals that are relatively consistent over time. LLAP then measures the phase changes of the sound signals caused by hand/finger movements and then converts the phase changes into the distance of the movement. We implemented and evaluated LLAP using commercial-off-the-shelf mobile phones. For 1-D hand movement and 2-D drawing in the air, LLAP has a tracking accuracy of 3.5 mm and 4.6 mm, respectively. Using gesture traces tracked by LLAP, we can recognize the characters and short words drawn in the air with an accuracy of 92.3% and 91.2%, respectively.

CCS Concepts

• Human-centered computing → Gestural input;

Keywords

Gesture Tracking; Ultrasound; Device-free

1. INTRODUCTION

1.1 Motivation

Gestures are natural and user-friendly Human Computer Interaction (HCI) mechanisms for users to control their devices. Gesture tracking allows devices to get fine-grained user input by quantitatively measuring the movement of their hands/fingers in the air.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.
MobiCom'16, October 03-07, 2016, New York City, NY, USA
© 2016 ACM. ISBN 978-1-4503-4226-1/16/10...
$15.00
DOI: http://dx.doi.org/10.1145/2973750.2973764

Device-free gesture tracking means that user hands/fingers are not attached to any device. Imagine that if a smart watch has the device-free gesture tracking capability, then the user can adjust the time in a touch-less manner as shown in Figure 1, where the clock hand follows the movement of the finger. Device-free gesture tracking is an enabling HCI mechanism for small wearable devices (such as smart watches) because fingers are too big to control the GUI elements on such small screens. In contrast, device-free gesture tracking allows users to provide input by performing gestures near a device rather than on a device. Device-free gesture tracking is also an important HCI mechanism for medium-to-large size mobile devices (such as smartphones and tablets), complementing touch screens, because it allows users to provide inputs without blocking the screen view, which gives users a better visual experience. Furthermore, device-free gesture tracking can work in scenarios where touch screens cannot, e.g., when users wear gloves or when the device is in the pocket.

Figure 1: Device-free gesture tracking

Practical device-free gesture tracking systems need to satisfy three requirements. First, such systems need to have high accuracy so that they can capture delicate movements of a hand/finger. Due to the small operational space around the mobile device, e.g., within tens of centimeters (cm) of the device, we need millimeter (mm) level tracking accuracy to fully exploit the control capability of human hands. Second, such systems need to have low latency (i.e., respond quickly), within tens of milliseconds, to hand/finger movement, without the user feeling lagging responsiveness. Third, they need to have low computational cost so that they can be implemented on resource-constrained mobile devices.

1.2 Limitations of Prior Art

Most existing device-free gesture tracking solutions use customized hardware [1–4].
Based on the fact that wireless signal changes as a hand/finger moves, Google made a customized chip in their Soli system that uses 60 GHz wireless signal with mm-level wavelength to track small movement of a hand/finger [1], and Teng et al. made customized directional 60 GHz transceivers
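The phase-to-distance relation described in the abstract can be sketched as follows. This is a minimal illustration, not LLAP's actual implementation: the carrier frequency, the exponential-moving-average background estimator, and all function names are assumptions introduced for the example. The one grounded idea is that a 2π phase change of the reflected signal corresponds to one wavelength of round-trip path change, so the hand itself moves half a wavelength.

```python
import numpy as np

# Illustrative parameters (not from the paper): a carrier near 20 kHz is a
# typical choice for inaudible acoustic sensing on phone speakers.
SPEED_OF_SOUND = 343.0                      # m/s in air at ~20 °C
CARRIER_FREQ = 20_000.0                     # Hz
WAVELENGTH = SPEED_OF_SOUND / CARRIER_FREQ  # ~17.15 mm

def remove_static_background(baseband, alpha=0.995):
    """Remove the background component that is relatively consistent over
    time, keeping only the part reflected by moving objects. This uses a
    simple exponential moving average; LLAP's own estimator may differ."""
    residual = np.empty_like(baseband)
    static = baseband[0]
    for i, sample in enumerate(baseband):
        static = alpha * static + (1 - alpha) * sample
        residual[i] = sample - static
    return residual

def phase_to_distance(reflected):
    """Convert phase changes of the reflected signal into movement distance.
    A 2*pi phase change corresponds to one wavelength of *path-length*
    change; the sound travels to the hand and back, so the hand itself
    moves half that distance."""
    phase = np.unwrap(np.angle(reflected))   # undo 2*pi wrap-arounds
    path_change = (phase - phase[0]) * WAVELENGTH / (2 * np.pi)
    return path_change / 2.0                 # one-way movement in meters

# A reflection whose phase sweeps through 4*pi implies two wavelengths of
# path change, i.e. the hand moved one wavelength.
sweep = np.exp(1j * np.linspace(0, 4 * np.pi, 200))
print(phase_to_distance(sweep)[-1] * 1000)   # ≈ 17.15 (mm)
```

In a full system the complex baseband samples would come from down-converting the microphone recording with the transmitted carrier; here the reflection is simply simulated as a rotating phasor.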