AirTyping: A Mid-Air Typing Scheme based on Leap Motion

Hao Zhang, State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China. H.Zhang@smail.nju.edu.cn
Yafeng Yin, State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China. yafeng@nju.edu.cn
Lei Xie, State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China. lxie@nju.edu.cn
Sanglu Lu, State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China. sanglu@nju.edu.cn

ABSTRACT
In Human-Computer Interaction (HCI), to reduce the dependency on bulky devices like physical keyboards and joysticks, many gesture-based HCI schemes have been adopted. As a typical HCI technology, text input has aroused much concern, and many virtual or wearable keyboards have been proposed. To further remove the keyboard and allow people to type in a device-free way, we propose AirTyping, i.e., a mid-air typing scheme based on Leap Motion. During the typing process, the Leap Motion Controller captures the typing gestures with cameras and provides the coordinates of finger joints. Then, AirTyping detects the possible keystrokes, infers the typed words based on a Bayesian method, and outputs the inputted word sequence. The experimental results show that our system can detect the keystrokes and infer the typed text efficiently, i.e., the true positive rate of keystroke detection is 92.2%, while the accuracy that the top-1 inferred word is the typed word reaches 90.2%.

CCS CONCEPTS
• Human-centered computing → Text input; Gestural input.

KEYWORDS
Mid-Air Typing; Leap Motion; Human-Computer Interaction

ACM Reference Format:
Hao Zhang, Yafeng Yin, Lei Xie, and Sanglu Lu. 2020. AirTyping: A Mid-Air Typing Scheme based on Leap Motion. In Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers (UbiComp/ISWC '20 Adjunct), September 12–16, 2020, Virtual Event, Mexico. ACM, New York, NY, USA, 4 pages. https://doi.org/10.1145/3410530.3414387

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
UbiComp/ISWC '20 Adjunct, September 12–16, 2020, Virtual Event, Mexico
© 2020 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-8076-8/20/09.
https://doi.org/10.1145/3410530.3414387

1 INTRODUCTION
The development of human activity recognition technology has brought new ways for Human-Computer Interaction (HCI). Specifically, people can perform gestures with arms, hands, or even fingers to interact with computers/devices, without the necessity of using joysticks or specially designed controllers, e.g., when playing motion-sensing games. As a typical HCI technology, text input has aroused people's attention; thus many gesture-based input schemes [1][2][3] have been proposed to get rid of the dependency on physical keyboards. However, the existing work tends to introduce virtual keyboards or wearable sensors for text input. To further remove the constraints of keyboards and wearable sensors, we propose AirTyping, i.e., a mid-air typing scheme based on Leap Motion, as shown in Fig. 1. When the user types words in mid-air over the Leap Motion Controller (LMC) with standard fingering, AirTyping utilizes the LMC to track the coordinates of finger joints and infers the typed words for text input. AirTyping can be used in many scenarios where it is inconvenient to use keyboards, or where the privacy of text input needs to be protected by avoiding a visible keyboard layout.

However, without a keyboard layout, it is difficult to map a finger's movement to a specific keystroke, which brings the challenges of keystroke detection and recognition. Specifically, considering that fingers not making a keystroke can also move, we introduce the bending angles of fingers, the movement trend of a finger across consecutive coordinates, and the time difference between keystrokes to detect the finger most likely to be making a keystroke. Besides, considering possibly wrong, false positive, and false negative detected

[Figure 1: Mid-air typing based on Leap Motion. The figure shows the finger-joint model (metacarpal, proximal phalanx, distal phalanx, bending angle θ), the target words, the detected fingers, and the inferred words.]
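The keystroke-detection cues described above (finger bending angle, downward movement trend, and inter-keystroke timing) can be sketched as a simple per-frame scoring heuristic. The function names, the scoring formula, and the 0.15 s minimum gap below are illustrative assumptions for exposition, not the paper's actual parameters:

```python
import math

def bending_angle(joint_a, joint_b, joint_c):
    """Angle (degrees) at joint_b between the finger segments
    joint_b->joint_a and joint_b->joint_c. A straight finger gives
    about 180 degrees; a smaller angle means a more bent finger."""
    v1 = [a - b for a, b in zip(joint_a, joint_b)]
    v2 = [c - b for c, b in zip(joint_c, joint_b)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def keystroke_score(angle_deg, dy, dt, min_gap=0.15):
    """Heuristic score that a finger just made a keystroke.
    angle_deg: bending angle at the proximal joint (degrees),
    dy: vertical fingertip displacement between consecutive frames
        (negative means moving downward),
    dt: time since the last detected keystroke (seconds).
    Thresholds are assumed for illustration only."""
    if dt < min_gap:                    # too soon after the last keystroke
        return 0.0
    bend = max(0.0, (180.0 - angle_deg) / 180.0)   # more bent -> higher
    down = max(0.0, -dy)                           # only downward motion counts
    return bend * down
```

At each frame, the finger with the highest nonzero score would be reported as the keystroke candidate; fingers that merely drift along with the hand produce little downward motion and little extra bending, so they score low.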
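The abstract's Bayesian word inference can likewise be sketched: given, for each detected keystroke, a distribution over which letter the finger motion could correspond to, rank dictionary words by a prior (a language-model word frequency) times the product of per-letter observation likelihoods. The data layout and names here are hypothetical; the paper's actual confusion model is not specified in this excerpt:

```python
from math import log

def rank_words(obs_probs, dictionary, word_prior):
    """Rank candidate words for a sequence of detected keystrokes.
    obs_probs: one dict per keystroke, mapping each letter to
               P(observation | letter) under some confusion model.
    dictionary: iterable of candidate words.
    word_prior: dict mapping word -> P(word) (language-model prior).
    Returns words sorted by log-posterior score, best first."""
    scored = []
    for word in dictionary:
        if len(word) != len(obs_probs):
            continue                      # keep only length-matching words
        score = log(word_prior.get(word, 1e-9))
        feasible = True
        for letter, probs in zip(word, obs_probs):
            p = probs.get(letter, 0.0)
            if p == 0.0:                  # letter impossible for this keystroke
                feasible = False
                break
            score += log(p)
        if feasible:
            scored.append((score, word))
    scored.sort(reverse=True)
    return [word for _, word in scored]
```

The top-ranked word corresponds to the "top-1 inferred word" accuracy reported in the abstract; handling insertions and deletions from false positive/negative keystrokes would require extending this to compare words of differing lengths.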