1:4 Y. Yin et al.

In-air gesture recognition: Parate et al. [26] design a mobile solution called RisQ that detects smoking gestures and sessions with a wristband, using a machine learning pipeline to process the sensor data. Blank et al. [7] present a system for table tennis stroke detection and classification by attaching inertial sensors to table tennis rackets. Thomaz et al. [31] describe the implementation and evaluation of an approach that infers eating moments using the 3-axis accelerometer in a smartwatch. Xu et al. [35] build a classifier that identifies the user's hand and finger gestures from the essential features of accelerometer and gyroscope data measured by a smartwatch. Huang et al. [18] build a system that monitors brushing quality using a manual toothbrush, modified by attaching small magnets to the handle, and an off-the-shelf smartwatch. These approaches typically extract features from sensor data and apply machine learning techniques for gesture recognition.

In-air gesture tracking: Zhou et al. [42][44][43] utilize a kinematic chain to track human upper-limb motion by placing multiple devices on the arm. Cutti et al. [11] utilize joint angles to track the movements of the upper limbs by placing sensors on the chest, shoulder, arm, and wrist. Chen et al. [8] design a wearable system consisting of a pair of magnetometers on the fingers and a permanent magnet affixed to the thumb, and introduce uTrack to turn the thumb and fingers into a continuous input system (e.g., for 3D pointing). Shen et al. [29] utilize a 5-DoF arm model and an HMM to track the 3D posture of the arm, using both the motion and magnetic sensors in a smartwatch. In fact, accurate in-air gesture tracking in real time can be very challenging. Besides, obtaining the 3D moving trajectory does not in itself amount to recognizing the in-air gesture. In this paper, we do not require accurate trajectory tracking; instead, we aim to obtain the gesture contour and recognize it as a character.
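The sensing front end shared by the recognition systems above, i.e., segmenting the inertial data stream into windows and computing statistical features for a classifier, can be sketched as follows. This is a minimal illustration, not the pipeline of any cited system; the window and step sizes are arbitrary assumptions.

```python
import numpy as np

def window_features(acc, win=50, step=25):
    """Slide a fixed-size window over 3-axis accelerometer samples
    (shape: N x 3) and emit simple statistical features per window:
    per-axis mean, per-axis standard deviation, and mean magnitude.
    win/step are illustrative values, not taken from any cited work."""
    feats = []
    for start in range(0, len(acc) - win + 1, step):
        w = acc[start:start + win]
        mag = np.linalg.norm(w, axis=1)          # per-sample acceleration magnitude
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0), [mag.mean()]]))
    return np.array(feats)
```

The resulting feature vectors would then be fed to a standard classifier (e.g., an SVM or a decision tree), which is the pattern the surveyed systems follow.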
Writing in the air: Zhang et al. [39] quantize the data into small integral vectors based on acceleration orientation, and then use an HMM to recognize the 10 Arabic numerals. Wang et al. [32] present IMUPEN to reconstruct the motion trajectory and recognize handwritten digits. Bashir et al. [6] use a pen equipped with inertial sensors and apply DTW to recognize handwritten characters. Agrawal et al. [1] recognize handwritten capital letters and Arabic numerals in a 2D plane based on strokes and a grammar tree, using the built-in accelerometer of a smartphone. Amma et al. [2] design a glove equipped with inertial sensors, and use an SVM, an HMM, and a statistical language model to recognize capital letters, sentences, etc. Deselaers et al. [13] present GyroPen to reconstruct the writing path for pen-like interaction. Xu et al. [36] utilize a continuous-density HMM and the Viterbi algorithm to recognize handwritten digits and letters using inertial sensors. In this paper, we focus on recognizing single in-air characters without the assistance of a language model. For a character, we do not define specific strokes or require a pen-up motion for stroke segmentation, while tolerating the intra-class variability caused by writing speeds, gesture sizes, and writing directions, as well as the observation ambiguity caused by viewing angles, etc., in 3D space.
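Several of the surveyed approaches (e.g., the pen of Bashir et al. [6]) are built on dynamic time warping, which compares two gesture sequences of different lengths by finding a minimum-cost monotonic alignment. A minimal sketch of the classic DTW distance between two 1-D feature sequences, for illustration only:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-time-warping distance between 1-D sequences a and b.
    D[i][j] holds the minimum cumulative alignment cost of a[:i] and b[:j]."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible predecessor alignments
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Because the warping path may stretch or compress either sequence, DTW tolerates variations in writing speed, which is exactly why it suits gesture matching; a nearest-neighbor classifier over DTW distances to labeled templates is a common recognition scheme.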
Handwritten character recognition: In addition to inertial sensor-based approaches, many image processing techniques [3][14][16] are also adopted for recognizing handwritten characters in a 2D plane (i.e., an image). Bahlmann et al. [4] combine DTW and SVMs to establish a Gaussian DTW (GDTW) kernel for on-line recognition of UNIPEN handwriting data. Rayar et al. [28] propose a preselection method for CNN-based classification and evaluate it on handwritten character recognition in images. Rao et al. [27] propose a newly designed network structure based on an extended nonlinear kernel residual network to recognize handwritten characters over the MNIST and SVHN datasets. These approaches focus on recognizing hand-moving trajectories in a 2D plane, while our paper focuses on transforming the 3D gesture into a proper 2D contour, and then utilizes the contour's space-time feature to recognize contours as characters.

ACM Trans. Sensor Netw., Vol. 1, No. 1, Article 1. Publication date: January 2019.