RF-Brush: 3D Human-Computer Interaction via Linear Tag Array
Yinyin Gong, Lei Xie, Chuyu Wang, Yanling Bu, Sanglu Lu
State Key Laboratory for Novel Software Technology, Nanjing University
Email: {yygong,wangcyu217}@dislab.nju.edu.cn, {lxie,sanglu}@nju.edu.cn, yanling@smail.nju.edu.cn

Abstract—Nowadays, novel approaches to 3D human-computer interaction have enabled manipulation in 3D space rather than 2D space. For example, the Microsoft Surface Pen leverages embedded sensors to sense 3D manipulations, such as inclining the pen to get bolder handwriting. In this paper, we propose RF-Brush, a battery-free and lightweight solution for 3D human-computer interaction based on RFID, which simply attaches a linear RFID tag array onto a linear-shaped object such as a brush. RF-Brush senses the 3D orientation and 2D movement of the linear-shaped object while the human subject is drawing with it in 3D space. Here, the 3D orientation refers to the relative orientation of the linear-shaped object to the operating plane, whereas the 2D movement refers to the moving trace in the 2D operating plane. In this way, we are able to transform an ordinary linear-shaped object like a brush or pen into an intelligent HCI device. Particularly, we build two geometric models to depict the relationship between the RF signal and the 3D orientation as well as the 2D movement, respectively. Based on these geometric models, we propose the linear tag array-based HCI solution, implement a prototype system, and evaluate its performance in real environments. The experiments show that RF-Brush achieves an average error of 5.7° and 8.6° for the elevation and azimuthal angle, respectively, and an average error of 3.8 cm and 4.2 cm in movement tracking along the X-axis and Y-axis, respectively. Moreover, RF-Brush achieves 89% letter recognition accuracy.

I.
INTRODUCTION

With the ubiquity of sensing techniques, human-computer interaction (HCI) has drawn more and more attention recently, and novel approaches are required to provide a more natural interaction. Currently, due to the wide adoption of smart devices (e.g., smartphones), the touch screen has become the most popular HCI approach. However, by using the finger or a point-touch pen for manipulation, the touch screen only provides 2D interaction with the device, which prevents the user from interacting with the computer in a 3D manner. For example, people may intend to incline the point-touch pen to draw a bolder line, which is a common manipulation for a real pen. But with a traditional point-touch pen, people need to separately configure the thickness of lines, which makes the manipulation tedious. Some dedicated devices, e.g., the Microsoft Surface Pen, can leverage embedded sensors to sense 3D manipulations, such as inclining the pen to get bolder handwriting, but they are usually expensive and only work with specific smart devices. Existing HCI approaches can be divided into four main groups: computer vision-based, sensor-based, WiFi-based and RFID-based. Computer vision-based approaches [1]–[3] and wireless techniques [4], [5], e.g., WiFi or FMCW, usually need an extensive training process. They either are sensitive to changes of background and lighting conditions, or only provide coarse-grained manipulation classification. Sensor-based approaches [6] can usually track the motion of the device in 3D space accurately based on the embedded IMU, but they usually suffer from limited battery life. As for RFID-based approaches [7], [8], most of them focus on interacting with the computer in 2D space.

Fig. 1: Illustration of the application scenario of RF-Brush.
Therefore, according to the above considerations, in this paper we aim to provide a battery-free and more natural 3D human-computer interaction by tracking fine-grained 3D motion based on cheap, ubiquitous objects. As Fig. 1 shows, we interact with the computer not only by the moving trace in 2D space, but also by the orientation. Here, the orientation reflects the thickness of lines, and the 2D moving trace reflects what we finally write. With the development of RFID techniques, the RFID tag has been widely regarded in recent years as a lightweight and battery-less sensor for locating and sensing. In this paper, we propose RF-Brush, a novel system based on a linear tag array attached onto a linear-shaped object, which can provide both the 2D movement and the 3D orientation information of the tagged object. Here, the 2D movement refers to the moving trace in the 2D operating plane, whereas the 3D orientation refers to the relative orientation of the object to the operating plane. The basic idea is to leverage the phase change of the linear tag array during the 3D motion to recover the 2D movement in the 2D operating plane and the 3D orientation in the 3D environment. Particularly, we first extract the phase difference between different tags at the same time point, which can be used for 3D orientation estimation via our proposed geometrical model. Moreover, we build another geometrical model, which can accurately track the 2D movement of the linear-shaped object's tip based on the phase variation between
consecutive time points of the same tag. Finally, combining both the 3D orientation and the 2D movement, we can recover the 3D motion of the linear-shaped object, which can further facilitate the interaction with computers. Realizing such RFID-based 3D human-computer interaction via a linear tag array entails three key challenges. 1) How to track the 2D movement and 3D orientation simultaneously? The 3D motion of the linear-shaped object leads to 2D movement and 3D rotation simultaneously, and if we are tracking some part of the object, the rotation of the object may also affect the moving trajectory. For example, the rotation of the object can lead to movement of the tip in the operating plane. The RF phase changes with both kinds of motion at the same time. Therefore, it is challenging to decompose the two kinds of phase change from one tag for 3D motion tracking. To handle this problem, we utilize a linear tag array for RF phase measurement, and build two models to depict the relationship between the two kinds of motion and the RF phase from both the time and space domains. 2) How to deal with the diversity caused by the tag's orientation? Since the 3D motion of the linear-shaped object unavoidably changes the orientation of the tag array, the change of tag orientation also leads to a phase offset, which is studied later in Section III. To address this challenge, we deploy the tag array linearly, so when we calculate the phase difference between tags, the diversity caused by the tag's orientation is canceled, because each tag has the same orientation. 3) How to improve the robustness of our system in real environments? Since the RF signal is sensitive to influences such as the multi-path effect and mutual interference, it is challenging to be robust in real environments.
To address this challenge, on one hand, we make use of redundant tags attached on the linear-shaped object in the space domain, and estimate the 3D motion of the linear-shaped object based on the RF signal of all the tags, which can calibrate the signal noise from part of the tags. On the other hand, we leverage the motion continuity, and set constraints for each motion estimation based on the previously estimated result in the time domain. We make three key contributions in this paper. First, we propose a novel system for 3D human-computer interaction, which provides not only the 2D movement, but also the 3D orientation of the object. By attaching a linear tag array onto the surface of a linear-shaped object, we are able to convert the object into an intelligent HCI device. Second, we build two geometric models to depict the relationship between the RF signal and the 3D orientation as well as the 2D movement, which study the fundamental features in RFID sensing systems and facilitate a great quantity of 3D human-computer interaction applications. Based on the proposed geometric models, we are able to estimate the 3D orientation and 2D movement accurately and simultaneously. Third, we implemented a prototype system of RF-Brush with COTS RFID and examined its performance in real environments. The experiments show that our system achieved an average error of 5.7° and 8.6° for the elevation angle and azimuthal angle, respectively, in the 3D orientation evaluation, and an average error of 3.8 cm and 4.2 cm along the X-axis and Y-axis, respectively, in the 2D movement evaluation.

II. RELATED WORK

RFID-based Trajectory Tracking: State-of-the-art systems usually take advantage of the phase variation from RFID tags at consecutive time points to accurately track the 2D movement [7], [9]–[12]. Representative work such as Tagoram [13] realizes real-time 2D movement tracking of the tagged object by estimating the absolute location of the tag at every time point based on multiple fixed RFID antennas.
Pantomime [8] leverages a tag array to track the trajectory of the moving object based on a single antenna, which can reduce the cost of multiple fixed RFID antennas. However, all of these existing works regard the object as a point, which implies the displacement of the object is equal to the displacement of the tag. Besides, they only focus on 2D human-computer interaction, which cannot satisfy the user's demand for 3D human-computer interaction. Different from these works, we need to track the 3D motion of linear-shaped objects, which contains both the 3D orientation and the 2D movement. RFID-based Orientation Tracking: Recently, several studies have been proposed to estimate the orientation of objects based on RFID techniques [14]–[16]. Tagyro [16] tracks the 3D orientation of the object by attaching a tag array and modeling the relationship between the orientation and the phase offset. Different from Tagyro, RF-Brush needs to track both the 3D orientation and the 2D trajectory simultaneously by attaching a linear tag array on the linear-shaped object. Therefore, we also need to study the relationship between phase and trajectory based on the orientation information. Other work such as PolarDraw [17] reconstructs the handwriting by estimating the azimuthal angle with a single tag, leveraging the RSS and phase trend based on two linearly polarized antennas, so as to track the 2D movement. In comparison, RF-Brush utilizes the phase received by a circularly polarized antenna and provides both 3D orientation and 2D movement features, which can be used to interact with the computer at the 3D level.

III. PRELIMINARIES

The RF phase is widely used for 2D localization and tracking in previous works [13], [18], and has been validated to be an efficient wireless attribute for mobile sensing. Theoretically, it represents the degree by which the received signal is offset from the sender, ranging from 0 to 2π, and is thus determined by the transmitting distance.
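As a rough numerical illustration of this distance-phase relation, consider the sketch below. The wavelength value is an assumption for illustration (roughly a 920 MHz UHF carrier), not a figure taken from the paper:

```python
import math

WAVELENGTH = 0.326  # meters; assumed ~920 MHz UHF carrier, for illustration only

def backscatter_phase(d):
    """Ideal phase for a reader -> tag -> reader round trip of length 2*d meters."""
    return (2 * d / WAVELENGTH * 2 * math.pi) % (2 * math.pi)

# Moving the tag away by a quarter wavelength lengthens the round trip by
# half a wavelength, shifting the measured phase by pi (mod 2*pi).
p0 = backscatter_phase(1.0)
p1 = backscatter_phase(1.0 + WAVELENGTH / 4)
assert abs(((p1 - p0) % (2 * math.pi)) - math.pi) < 1e-9
```

A real measurement additionally carries the static device offset and the orientation-dependent offset discussed next, which is why the system later works with phase differences between identically oriented tags rather than with raw phase readings.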
In a real RFID system, the manufacturing technique of both the reader and the tag also introduces some static phase offset, which is usually regarded as the device diversity of the phase. In addition to the static device diversity, the RF phase is also affected by the dynamic status of each tag. Since the 3D motion of the tagged object unavoidably leads to a change of tag orientation in 3D space, we conduct experiments to study the influence of tag orientation on the phase, as illustrated in Fig. 2. Particularly, we rotate one RFID tag by 180° at a fixed point along three different axes in front
of an RFID reader, and present the phase change in Fig. 3(a).

Fig. 2: Experiment deployment of tag rotation.
Fig. 3: Empirical study when rotating the tag. (a) Phase change along with the tag orientation. (b) The measured phase when rotating along the Y-axis.

According to the results, we find that rotation along the Y-axis introduces a linear decrement in the phase measurement, while rotation along the X-axis or Z-axis hardly affects the phase measurements. This means the tag orientation along the Y-axis also introduces some offset to the actual phase measurement, which is defined as the orientation diversity of the phase. Therefore, supposing the transmitting distance is dT, the measured phase value θ can be calculated as:

θ = (2dT/λ · 2π + θdev + θγ) mod 2π, (1)

where λ is the wavelength of the RF signal, θdev denotes the device diversity, and θγ represents the orientation diversity. Here, the device diversity θdev contains the phase diversity caused by both the reader and the tag. In an RFID system, since the reader communicates with the tag via backscattering, the signal traverses a distance of 2dT in total, which thus leads to a phase change of 2dT/λ · 2π. Furthermore, we investigate the consistency of the orientation diversity across different tags. Since the orientation diversity is only related to rotation along the Y-axis, we next rotate four different tags along the Y-axis in turn at the same fixed point. Fig. 3(b) presents the phase change along with the rotation along the Y-axis. It is clear that all four tags have the same phase trend along with the rotation angle. Therefore, we conclude that all the tags have the same orientation diversity θγ if the rotation angle along the Y-axis is the same.

IV. MODELING 3D MOTION VIA A LINEAR TAG ARRAY

A.
Decomposing 3D Motion into 3D Orientation and 2D Movement

Before deeply investigating the relationship between the 3D motion and the RF signal of the linear-shaped object, we first demonstrate how to decompose the 3D motion. Without loss of generality, we use a brush in place of the linear-shaped object in this paper to demonstrate our model. Typically, the brush is used to write or draw pictures on the operating plane, which can be regarded as interaction with computers. Therefore, we focus on tracking such typical 3D motion, where the brush is moving and rotating simultaneously while keeping the tip on the operating plane. Fig. 4 presents a simple case where the user writes on the X-Y plane. Here, the whole motion of the brush is composed of a series of instantaneous postures at different time points. Each instantaneous posture can be expressed by the absolute position of the tip on the operating plane and the 3D orientation of the brush in the 3D environment. Therefore, the whole motion of the brush is decomposed as follows: moving the absolute position of the brush tip, and rotating the orientation of the brush relative to the operating plane. For the first part, we only need to extract the relative moving trajectory of the tip, i.e., the relative displacement of the absolute position between consecutive time points, which is called the 2D movement in the next section. For the second part, we need to extract the 3D orientation of the brush at each time point, which will affect the relative trajectory estimation. Here, the 3D orientation refers to the relative orientation of the brush to the operating plane.

Fig. 4: Decomposing 3D motion into 3D rotation and 2D movement.

Based on this understanding, we can express the instantaneous posture of the brush in the Cartesian coordinate system, as shown in Fig. 4.
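The posture parameterization above (tip position plus the elevation and azimuthal angles defined next) fully determines where each tag on the brush sits in 3D. The following is a minimal geometric sketch of this fact, our illustration rather than the paper's implementation; the tag offsets along the brush are assumed known:

```python
import math

# Sketch (an illustration, not the paper's code): given the tip position on
# the operating plane and the brush orientation, a tag attached l_i meters
# from the tip along the brush has a fully determined 3D coordinate.
def tag_position(tip_xy, beta_e, beta_a, l_i):
    """3D coordinate of a tag l_i meters up the brush from the tip.

    beta_e: elevation angle between the brush and the X-Y plane (0..pi/2).
    beta_a: azimuthal angle of the brush projection from the X-axis (0..2*pi).
    """
    x_b, y_b = tip_xy  # the tip stays on the operating plane, so its z is 0
    return (
        x_b + l_i * math.cos(beta_e) * math.cos(beta_a),
        y_b + l_i * math.cos(beta_e) * math.sin(beta_a),
        l_i * math.sin(beta_e),
    )

# An upright brush (beta_e = pi/2) places every tag directly above the tip.
x, y, z = tag_position((0.10, 0.20), math.pi / 2, 0.0, 0.05)
assert abs(x - 0.10) < 1e-9 and abs(y - 0.20) < 1e-9 and abs(z - 0.05) < 1e-9
```

Note that the X-axis component of the tag offset, l_i · cos βe · cos βa, is exactly the projection quantity that the phase-difference model in the next subsection builds on.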
The X-Y plane represents the operating plane of the brush, and the Z-axis is perpendicular to the X-Y plane, representing the height away from the X-Y plane. Therefore, we can use (xB, yB, 0) to represent the absolute position of the brush tip. As for the brush orientation, we can use the elevation angle and azimuthal angle to uniquely define the orientation. Particularly, the elevation angle, denoted as βe, measures the angle between the brush and the X-Y plane, which indicates the slope of the brush with respect to the X-Y plane. The azimuthal angle, denoted as βa, measures the angle between the brush's projection on the X-Y plane and the X-axis, which indicates the azimuth shift of the brush, rotating anti-clockwise from the X-axis. Therefore, when the brush is moving on the X-Y plane, βe ranges from 0 to π/2, and βa ranges from 0 to 2π. Then, combining the position of the brush tip (xB, yB, 0) and the brush orientation ⟨βe, βa⟩, we can accurately describe the instantaneous posture of the brush. Thus, our goal is to estimate both the brush orientation and the brush position to track the 3D motion of the brush, which is solved via two models in the next two subsections.

B. Modeling 3D Orientation via Linear Tag Array

We next demonstrate how to estimate the brush orientation, i.e., ⟨βe, βa⟩, by leveraging the linear tag array. Fig. 5 illustrates
the basic deployment of RF-Brush. Particularly, we attach a linear tag array on the surface of the brush for passive sensing, whose deployment is known in advance. For simplicity, we demonstrate our model with only two tags, i.e., T1 and T2, and the reader can easily extend the model to an array with multiple tags. We use (x1, y1, z1) and (x2, y2, z2) to represent the coordinates of the two tags in our model. Here, the two tags T1 and T2 are l1 and l2 away from the brush tip, respectively. Besides, the antennas Ax and Ay are deployed on the X-Y plane to measure the RF phase of the tags from two perpendicular dimensions. Antenna Ax is on the Y-axis, and mainly measures the phase change caused by tag displacement along the X-axis. Antenna Ay is on the X-axis, and measures the phase change due to the displacement along the Y-axis. Based on this deployment, our basic idea is to first calculate the phase difference between the two tags at the same time point, then estimate the brush projection on the X-Y plane from the phase difference of the two tags, and finally leverage the brush projection to calculate the brush orientation.

Fig. 5: Deployment of RF-Brush.

1) Calculating Brush Projection from Phase Difference: Firstly, we demonstrate how to refine the phase difference between two tags, which can be further used to estimate the brush projection. Consider Fig.6, which uses antenna Ax to measure the RF phase and presents the top view of Fig.5. Here, the RF signals from Ax reach T1 before T2. According to Eq. (1), the phase difference ∆θx,1,2 between T1 and T2 received by antenna Ax can be calculated as:

∆θx,1,2 = ((2(x1 − x2)/λ) × 2π + ∆θdev,1,2 + ∆θγ,1,2) mod 2π. (2)

Here, ∆θdev,1,2 is the phase difference caused by the device diversity. Since the device diversity is constant for all the tags, we can remove it in advance. ∆θγ,1,2 represents the phase difference caused by the orientation diversity.
Based on the key observation in Section III, this term is equal to 0 in our model, because the two tags have the same orientation and thus the same orientation diversity. Moreover, we can simply remove the mod function, because human writing is much slower than the sampling rate of RFID, which is discussed later in Section V. Therefore, the phase difference ∆θx,1,2 in Eq. (2) can be simplified as:

∆θx,1,2 = (2(x1 − x2)/λ) × 2π = (2∆x1,2/λ) × 2π, (3)

where ∆x1,2 is the coordinate difference between T1 and T2 along the X-axis. By replacing the coordinate difference ∆x1,2 with the relative distance difference between the tags and the brush tip along the X-axis, we can further rewrite Eq. (3) as:

∆θx,1,2 = (2(l1,x − l2,x)/λ) × 2π, (4)

where li,x represents the projected distance of li on the X-axis, as shown in Fig.6. In essence, the phase difference ∆θx,1,2 measures the distance between the projections of T1 and T2 along the X-axis.

Fig. 6: Relationship between phase difference and brush projection.

2) Estimating Brush Orientation from Brush Projection: Next, we demonstrate how to estimate the brush orientation based on l1,x − l2,x. In Fig.6, we can calculate the projection of li from the brush orientation ⟨βe, βa⟩ based on the geometric position as:

li,x = li × cos βe × cos βa. (5)

Therefore, we further rewrite Eq. (4) with the brush orientation ⟨βe, βa⟩ as:

∆θx,1,2 = (2(l1 − l2) × cos βe × cos βa / λ) × 2π. (6)

Since l1 and l2 have been measured in advance, the only unknown parameters are the brush orientation ⟨βe, βa⟩. So far, we have modeled the brush orientation based on the phase difference received from antenna Ax. However, Eq. (6) still cannot uniquely express the brush orientation. Since βa ranges from 0 to 2π, cos βa and cos(2π − βa) have the same value. As a result, there is always an ambiguous orientation with the same phase difference.
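A quick numerical check of Eq. (6) makes the ambiguity explicit (a sketch with assumed values: λ = 0.327 m for the UHF band, and l1 = 2 cm, l2 = 10 cm, matching the setup used later in the evaluation):

```python
import math

WAVELENGTH = 0.327  # meters, ~915 MHz UHF band (assumed value)

def delta_theta_x(l1, l2, beta_e, beta_a, lam=WAVELENGTH):
    """Theoretical phase difference at antenna Ax, Eq. (6)."""
    return (2 * (l1 - l2) * math.cos(beta_e) * math.cos(beta_a) / lam) * 2 * math.pi

# cos(beta_a) = cos(2*pi - beta_a): antenna Ax alone cannot tell
# an azimuth of 50 degrees from one of 310 degrees.
a = delta_theta_x(0.02, 0.10, math.radians(60), math.radians(50))
b = delta_theta_x(0.02, 0.10, math.radians(60), math.radians(310))
print(abs(a - b) < 1e-9)  # True
```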
To solve the ambiguity and calculate the orientation, we deploy another antenna Ay as shown in Fig.6, which is mutually orthogonal to antenna Ax. The operating plane is thus the central area where the beams of the two antennas meet, and antenna Ay measures the phase difference from another direction to complement antenna Ax. Then, following Eq. (6), we can write the equation based on the phase difference measured from antenna Ay as:

∆θy,1,2 = (2(l1 − l2) × cos βe × sin βa / λ) × 2π. (7)

Comparing Eq. (6) with Eq. (7), cos βa is replaced with sin βa, and thus we can uniquely describe the brush orientation with the two equations. By combining Eq. (6) and Eq. (7), we can build a nonlinear equation set:

[∆θx,1,2; ∆θy,1,2] = [(2(l1 − l2) × cos βe × cos βa / λ) × 2π; (2(l1 − l2) × cos βe × sin βa / λ) × 2π], (8)

and the goal is to solve the brush orientation ⟨βe, βa⟩. In Section V-C, we will demonstrate how to utilize the nonlinear least squares to estimate βe and βa.
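With both antennas, the pair (∆θx,1,2, ∆θy,1,2) pins the orientation down. For a single tag pair, Eq. (8) can even be inverted in closed form, which is handy for intuition (the paper's pipeline uses nonlinear least squares over many pairs; this is only an illustrative sketch with an assumed wavelength and hypothetical function name):

```python
import math

def solve_orientation(dtheta_x, dtheta_y, l1, l2, lam=0.327):
    """Closed-form inversion of Eq. (8) for one tag pair.
    Returns (beta_e, beta_a) in radians."""
    k = (2 * (l1 - l2) / lam) * 2 * math.pi         # common scale factor
    u, v = dtheta_x / k, dtheta_y / k               # (cos_be*cos_ba, cos_be*sin_ba)
    beta_a = math.atan2(v, u) % (2 * math.pi)       # azimuth in [0, 2*pi)
    beta_e = math.acos(min(1.0, math.hypot(u, v)))  # elevation in [0, pi/2]
    return beta_e, beta_a

# Round trip: synthesize phase differences for a known orientation.
be, ba = math.radians(40), math.radians(200)
k = (2 * (0.02 - 0.10) / 0.327) * 2 * math.pi
est_e, est_a = solve_orientation(k * math.cos(be) * math.cos(ba),
                                 k * math.cos(be) * math.sin(ba),
                                 0.02, 0.10)
print(round(math.degrees(est_e)), round(math.degrees(est_a)))  # 40 200
```

The sin/cos pair feeding `atan2` is exactly what breaks the cos βa = cos(2π − βa) ambiguity of a single antenna.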
C. Modeling the 2D Movement via Linear Tag Array

Next, we demonstrate how to calculate the 2D movement of the brush on the operating plane based on the estimated brush orientation. The basic idea is to ignore the absolute position of the brush tip at each instantaneous posture, and estimate the relative displacement of the brush tip between consecutive instantaneous postures. Since we know the distance between each tag and the brush tip, we can estimate the relative displacement of the brush tip from the relative displacement of each tag.

Firstly, we calculate the relative displacement of each tag based on the phase variation. Supposing the phase of tag Ti at the t-th instantaneous posture received by antenna Ax is θx,i^(t), the phase variation ∆θx,i^(t) can be calculated according to Eq. (1) as:

∆θx,i^(t) = θx,i^(t) − θx,i^(t−1) = (2∆xi^(t)/λ) × 2π + ∆θdev,i^(t) + ∆θγ,i^(t), (9)

where ∆xi^(t) = xi^(t) − xi^(t−1) measures the displacement of tag i along the X-axis at the t-th instantaneous posture. Since the device diversity θdev,i is constant over time, the term ∆θdev,i^(t) is equal to 0. In regard to the term ∆θγ,i^(t), since it is proved to be proportional to the orientation of the tag in Section III, we can measure θγ for each angle γ in advance and cancel the term based on the estimated brush orientation ⟨βe, βa⟩. By canceling the influence of both ∆θγ,i^(t) and ∆θdev,i^(t), we can thus deduce the tag displacement ∆xi^(t) from the phase variation ∆θx,i^(t) accordingly.

Secondly, we further calculate the relative displacement of the brush tip from the tag displacement. Suppose that at the t-th instantaneous posture, the coordinates of the brush tip and tag Ti are (xB^(t), yB^(t), 0) and (xi^(t), yi^(t), zi^(t)), respectively. Then, as shown in Fig.7, we can build an equation as:

xi^(t) = xB^(t) + li,x^(t), (10)

where li,x^(t) can be calculated based on Eq. (5).
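The first step above — recovering a tag's displacement from its calibrated phase variation in Eq. (9) — reduces to a linear rescaling once the device and orientation diversities are removed (a minimal sketch; `tag_displacement` and the wavelength value are assumptions):

```python
import math

def tag_displacement(theta_t, theta_prev, dtheta_gamma, lam=0.327):
    """Tag displacement along one axis from the phase variation of
    Eq. (9): the device diversity cancels between snapshots, and the
    pre-measured orientation diversity dtheta_gamma is subtracted."""
    dtheta = theta_t - theta_prev - dtheta_gamma
    return dtheta * lam / (4 * math.pi)  # invert dtheta = (2*dx/lam)*2*pi

# A 1 cm move toward antenna Ax changes the phase by 4*pi*0.01/lam.
dphi = 4 * math.pi * 0.01 / 0.327
print(round(tag_displacement(dphi, 0.0, 0.0), 4))  # 0.01
```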
Thus, to calculate the displacement of the brush tip compared with the (t−1)-th instantaneous posture, we can calculate ∆xi^(t) as:

∆xi^(t) = ∆xB^(t) + li,x^(t) − li,x^(t−1). (11)

Therefore, we can deduce the displacement of the brush tip ∆xB^(t) from the displacement of each tag ∆xi^(t) and the change of the brush projection li,x^(t) − li,x^(t−1). As for the displacement along the Y-axis ∆yB^(t), we can follow Eq. (11) and calculate it similarly. Finally, the displacement of the brush tip on the X-Y plane can be represented as:

[∆xB^(t); ∆yB^(t)] = [∆xi^(t); ∆yi^(t)] + [li,x^(t−1) − li,x^(t); li,y^(t−1) − li,y^(t)]
= [∆xi^(t); ∆yi^(t)] + li × [cos βa^(t−1), −cos βa^(t); sin βa^(t−1), −sin βa^(t)] × [cos βe^(t−1); cos βe^(t)]. (12)

In fact, based on the estimated brush orientation, we only need one tag to estimate the 2D trajectory. Since we have attached a linear tag array on the brush, we can instead combine the tracking results of all the tags to improve the accuracy. By concatenating the displacements of the brush tip between consecutive instantaneous postures, we can recover the trajectory of the brush on the X-Y plane.

Fig. 7: Modeling the displacement of brush's tip.

V. SYSTEM DESIGN

A. System Overview

RF-Brush is a 3D human-computer interaction system, which tracks the 3D motion of a tagged linear shaped object based on the RFID technique. The basic idea is to deploy a linear tag array onto the object, and use a pair of antennas to track the 3D orientation and 2D movement of the linear shaped object based on the phase difference and phase variation. Without loss of generality, we use the brush as the target object to demonstrate our system. Fig.8 presents the system framework of RF-Brush. Particularly, the system first takes as input the time-series RF phase of the linear tag array, which is attached on the surface of the brush.
Then, the first module, Data Preprocessing, re-samples the time-series RF phase to resolve the random sampling in the RFID system due to the Frame-Slotted ALOHA protocol, and then removes the device diversity and the periodicity of the RF phase. After that, the 3D Orientation Estimation module estimates the orientation of the brush by resolving the orientation from the phase differences between different tags. Next, the 2D Movement Tracking module calculates the displacement of the brush tip based on the estimated brush orientation and the phase variation of the same tag at consecutive time points. Finally, we can recover the 3D motion of the brush from the estimated 3D orientation and 2D movement.

Fig. 8: System framework.

B. Data Preprocessing

Data preprocessing is used to improve the reliability of the RF phase by mitigating the noise in real environments and making the sampling rate of the RFID system uniform. Particularly, in regard to the time-series RF phase of each tag, we
first stitch the phase values and remove the 2π jumps, which are a common phenomenon in phase measurement. Then, we utilize a Kalman filter to remove the corresponding noises in the phase values. After that, supposing we attach n tags on the brush, we re-sample the phase from the two antennas of the n tags into m snapshots by interpolation, where each snapshot measures one instantaneous posture of the brush. We use the n × m matrix ϑ = [ϑi^(t)] to denote the phase measurements, where ϑi^(t) = ⟨θx,i^(t), θy,i^(t)⟩ denotes the phase of Ti in the t-th snapshot, and θx,i^(t) and θy,i^(t) respectively represent the phases from antennas Ax and Ay. In our system, we re-sample the time-series phase to 20Hz based on the reading rate in real environments.

C. 3D Orientation Estimating

1) Removing the Periodicity of Phase Difference: As mentioned in Eq. (7) of Section IV-B, the phase difference is determined by the distance between the adjacent tags. Therefore, to guarantee that each phase difference uniquely represents one brush orientation, we first set the distance between adjacent tags, e.g., l2 − l1, to less than λ/4, so that the theoretical phase difference ranges from −π to π. However, due to the signal noise in real phase measurements, it is still unreliable to directly use the raw phase difference, because it is hard to determine whether the actual phase difference is ∆θy,1,2 or ∆θy,1,2 ± 2π, which we call the periodicity of the phase difference.

To solve the periodicity of the phase difference, we make use of the deployment of the linear tag array and leverage the consistent phase differences between adjacent tags. The basic idea is that the phase differences between adjacent tags are either all positive or all negative, because the tag array is linearly deployed on the brush. Figure 9 illustrates the whole concept of the idea.
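The sign-consistency idea can be sketched as follows (Python; the function name is hypothetical, and the numeric example reproduces the four-tag measurement discussed next):

```python
import math

def correct_periodicity(phases):
    """Adjust adjacent-tag phase differences so they all share the
    majority sign: since the tags are collinear on the brush, the true
    adjacent differences must be either all positive or all negative."""
    diffs = [phases[i] - phases[i + 1] for i in range(len(phases) - 1)]
    positives = sum(1 for d in diffs if d > 0)
    s_m = 1 if positives > len(diffs) / 2 else -1  # majority sign
    corrected = []
    for d in diffs:
        if d > 0 and s_m < 0:    # positive outlier: subtract 2*pi
            d -= 2 * math.pi
        elif d < 0 and s_m > 0:  # negative outlier: add 2*pi
            d += 2 * math.pi
        corrected.append(d)
    return corrected

print([round(d, 2) for d in correct_periodicity([1.37, 4.8, 0.24, 2.54])])
# [-3.43, -1.72, -2.3]
```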
In regard to antenna Ay, when the angle βa is smaller than π, the distance between Ti and Ay is smaller than the distance between Ti+1 and Ay, i.e., ly,i < ly,i+1, so all the adjacent phase differences θy,i − θy,i+1 should share the same sign. In practice, we first determine the majority sign sM over all the measured adjacent phase differences, where S(·) denotes the sign function, and then adjust each phase difference as:

∆θ̃y,i,i+1 = θy,i − θy,i+1, if S(θy,i − θy,i+1) = sM;
∆θ̃y,i,i+1 = θy,i − θy,i+1 − 2π, if S(θy,i − θy,i+1) > sM;
∆θ̃y,i,i+1 = θy,i − θy,i+1 + 2π, if S(θy,i − θy,i+1) < sM. (13)

For the second case, S(θy,i − θy,i+1) = 1 and sM = −1, so we need to subtract 2π; similarly, for the third case, we need to add 2π.

Fig. 9: Removing the periodicity of phase difference for Ay.

Take the phase measurements in real environments as an example. Suppose the measured phase values for the four tags ⟨T1, T2, T3, T4⟩ are ⟨1.37, 4.8, 0.24, 2.54⟩, respectively. Then the adjacent phase differences are ⟨−3.43, 4.56, −2.3⟩. Here, two phase differences are negative and only one is positive. Therefore, we adjust the phase differences to ⟨−3.43, −1.72, −2.3⟩, which is the correct result of the phase differences.

2) Estimating the 3D Orientation based on Phase Difference: After removing the periodicity of the phase difference, we demonstrate how to calculate the brush orientation, i.e., ⟨βe, βa⟩, from the model in Section IV-B. Since we can deploy n tags on the brush to improve the robustness of RF-Brush, we can always calculate C(n, 2) groups of phase differences from tag pairs, where C(n, 2) is the number of ways to select 2 tags from all the n tags. Therefore, given the orientation ⟨βe, βa⟩, we can calculate the error function of the phase differences E(βe, βa) over all tag pairs as:

E(βe, βa) = Σ_{i=1}^{n} Σ_{j=i+1}^{n} (|∆θx,i,j − ∆θ̃x,i,j|² + |∆θy,i,j − ∆θ̃y,i,j|²), (14)

where ∆θx,i,j is the theoretical phase difference with respect to the orientation ⟨βe, βa⟩ based on Eq. (8), and ∆θ̃x,i,j represents the measured phase difference after the periodicity removal based on Eq. (13). Therefore, our goal is to find the orientation that minimizes the error function E(βe, βa):

(βe, βa)* = arg min_{βe,βa} E(βe, βa),
s.t. the measured phase differences {(∆θ̃x,1,2, ∆θ̃y,1,2), …, (∆θ̃x,n−1,n, ∆θ̃y,n−1,n)}. (15)
We utilize the nonlinear least squares to estimate ⟨βe, βa⟩, taking advantage of the Gauss-Newton method, which is based on a linear approximation of the objective function E(βe, βa). We start with an initial approximation of the parameter vector ⟨βe, βa⟩ and iteratively update it until E(βe, βa) converges to a local minimum. The final parameter vector is the estimated result. In this way, we obtain the brush orientation at every snapshot.

3) Calibrating the Abnormal Orientation: Due to influences such as the multi-path effect and mutual interference in real environments, the estimated brush orientations may still deviate from the correct values at times. The main reason is that the human hand may change the multi-path environment dynamically, so that some orientations may be estimated abnormally. To efficiently detect these abnormal orientations, we leverage the fact that the maximum rotating speed of human motion is usually limited by a constant speed
rmaz.Therefore,once the rotation angle between consecutive snapshots exceeds At x rmar,we can regard the orientation in the snapshot is abnormal.Here,At represents the time interval between consecutive snapshots. After detecting the abnormal orientations,we need to cal- ibrate the abnormal brush orientation.The basic idea is that y.0)y,0) the rotation speed of human rotation can be regarded as a 9y8,0) uniform motion,so that we can use the previous rotation (x0.y.0 speed tocalibrate the abnormal orientation.Suppose) and r-i Fig.10:Tracking the continuous movement of brush via linear represent the average speed of elevation angle tag array. Be and azimuthal angle Ba rotation,which can be calculated from previous snapshots.Then,the final orientation of the t-th 3)Calibrating the Abnormal Displacements:Similar to snapshot can be calculated as:B)=)1)x At and Section V-C3,the trajectory result is also affected by the B=Bt-1)+r-)×△t.Finally,we use the(6,B&) ambient noises.Therefore,we further detect the abnormal to replace the abnormal orientation. displacements and calibrate them based on the moving speed. D.2D Movement Tracking The basic idea is that the maximum writing speed of human 1)Removing the Orientation Diversity:According to the is usually limited by a constant speed vmaz Similarly,based preliminary studies in Section III,the RF phase of each tag is on the maximum moving speed Umaz,we can select the ab- affected by the orientation diversity,because the orientation of normal displacements if the distance of displacement exceeds the tag is unavoidably changed along with the rotation of the At x Umaz.After that,we can calibrate the displacement brush.To remove the orientation diversity,we first measure the according to the average moving speed )and)as: orientation diversity of each tags at different rotation angles in x8=x8-+-x△t and y调=yg-+g-)x△t. 
advance.Then,when we get the brush orientation according At last,we use the value of (to replace the corre- to the Section V-C.we can eliminate the orientation diversity sponding abnormal displacements for trajectory calculation by offsetting the preserved orientation diversity of each tag. VI.PERFORMANCE EVALUATION In this way,we can guarantee that the calibrated RF phase is only related to the distance to the antenna. A.Experimental Setup 2)Tracking the 2D Movement based on Phase Variation: In order to validate the performance of RF-Brush,we After we have removed the orientation diversity,we further conduct several experiments on 3D orientation estimation and demonstrate to track the trajectory based on the model in 2D movment tracking in realistic settings.The experimental Section IV-C.Particularly,Fig.10 presents the concept of setup of RF-Brush consists of a linear tag array of 4 AZ- tracking the trajectory of the brush based on two mutually 9629 RFID tags and an ImpinJ Speedway R420 RFID reader orthogonal antennas.Since we only focus on the relative integrated with two Laird S9028PCL RFID antennas.As the trajectory rather than absolute position of brush,the initial Figure 11 shows,the two antennas are separated 100cm away position of the trajectory can be any position.For simplicity, from the operating plane with the same height as the operating we choose to set the initial position of the brush tip as plane.The size of operating plane is about 25 x 25cm2.The the origin of coordinate.Then we can calculate the relative distance between two adjacent tags is 8cm,and the distance displacement of brush tip at each snapshot according to Eq. between first tag to the brush tip is 2cm.An optiTrack [19] (12).After that,we can recover the trajectory as: device is also deployed to collect the ground-truth of 3D (t) (0) () orientation and 2D movement. 
,(t) t-1)十 t) △yB (0) (k) Metrics.In this paper,we use the elevation angle error and azimuthal angle error to evaluate the 3D orientation and use (16) where (is the initial position.ie.. the distance error to evaluate the 2D movement,which are the origin of the difference between the estimated value and the ground- coordinate.Finally,we can track both the 3D orientation and truth from optiTrack.Moreover,in regard to the hand writing 2D movement of tagged brush. letters,we also use LipiTk [20]to recognize the letters and In theory,based on only one tag.it is possible to calculate use accuracy rate to evaluate the performance. the relative displacement of the brush tip.However,due to the ambient noises in the real environments,it is inaccuracy to OptiTrack S250e Camera compute the trajectory simply based on one tag.Instead,we take advantage of the tag array deployment of RF-Brush,and RFID Antenna Pa use all the tags to produce a more precise trajectory.Since we have n tags attached on the brush,we can compute n groups value of (of the t-th snapshot.Then we can use the linear regression to combine all the n groups value to calculate the final 2D movement. Fig.11:The experimental setup
$r_{max}$. Therefore, once the rotation angle between consecutive snapshots exceeds $\Delta t \times r_{max}$, we can regard the orientation in that snapshot as abnormal. Here, $\Delta t$ represents the time interval between consecutive snapshots. After detecting the abnormal orientations, we need to calibrate them. The basic idea is that human rotation can be regarded as a uniform motion, so we can use the previous rotation speed to calibrate the abnormal orientation. Suppose $r_e^{(t-1)}$ and $r_a^{(t-1)}$ represent the average rotation speeds of the elevation angle $\beta_e$ and the azimuthal angle $\beta_a$, which can be calculated from the previous snapshots. Then, the final orientation of the $t$-th snapshot can be calculated as $\beta_e^{(t)} = \beta_e^{(t-1)} + r_e^{(t-1)} \times \Delta t$ and $\beta_a^{(t)} = \beta_a^{(t-1)} + r_a^{(t-1)} \times \Delta t$. Finally, we use $\langle \beta_e^{(t)}, \beta_a^{(t)} \rangle$ to replace the abnormal orientation.

D. 2D Movement Tracking

1) Removing the Orientation Diversity: According to the preliminary studies in Section III, the RF phase of each tag is affected by the orientation diversity, because the orientation of the tag unavoidably changes along with the rotation of the brush. To remove the orientation diversity, we first measure the orientation diversity of each tag at different rotation angles in advance. Then, once we obtain the brush orientation according to Section V-C, we can eliminate the orientation diversity by offsetting the preserved orientation diversity of each tag. In this way, we can guarantee that the calibrated RF phase is only related to the distance to the antenna.

2) Tracking the 2D Movement based on Phase Variation: After removing the orientation diversity, we further track the trajectory based on the model in Section IV-C. Particularly, Fig. 10 presents the concept of tracking the trajectory of the brush based on two mutually orthogonal antennas.
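The abnormal-orientation handling of Section V-C3 above can be sketched as follows. This is a minimal Python sketch under the paper's uniform-rotation assumption; the `R_MAX` value and the function name are illustrative, not from the paper.

```python
R_MAX = 180.0  # assumed maximum human rotation speed (deg/s); illustrative

def calibrate_orientation(angles, dt):
    """Replace abnormal snapshot angles (those rotating faster than
    R_MAX between snapshots) by extrapolating the previous rotation
    speed, i.e., beta^(t) = beta^(t-1) + r^(t-1) * dt."""
    out = list(angles)
    for t in range(1, len(out)):
        if abs(out[t] - out[t - 1]) > R_MAX * dt:
            # previous average rotation speed; 0 when there is no history yet
            r_prev = (out[t - 1] - out[t - 2]) / dt if t >= 2 else 0.0
            out[t] = out[t - 1] + r_prev * dt
    return out
```

The same extrapolation is applied independently to the elevation and azimuthal angle sequences.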
Since we only focus on the relative trajectory rather than the absolute position of the brush, the initial position of the trajectory can be any position. For simplicity, we choose to set the initial position of the brush tip as the origin of the coordinate system. Then we can calculate the relative displacement of the brush tip at each snapshot according to Eq. (12). After that, we can recover the trajectory as:

$$\begin{bmatrix} x_B^{(t)} \\ y_B^{(t)} \end{bmatrix} = \begin{bmatrix} x_B^{(t-1)} \\ y_B^{(t-1)} \end{bmatrix} + \begin{bmatrix} \Delta x_B^{(t)} \\ \Delta y_B^{(t)} \end{bmatrix} = \begin{bmatrix} x_B^{(0)} \\ y_B^{(0)} \end{bmatrix} + \sum_{k=1}^{t} \begin{bmatrix} \Delta x_B^{(k)} \\ \Delta y_B^{(k)} \end{bmatrix}, \quad (16)$$

where $(x_B^{(0)}, y_B^{(0)})$ is the initial position, i.e., the origin of the coordinate system. Finally, we can track both the 3D orientation and 2D movement of the tagged brush.

In theory, it is possible to calculate the relative displacement of the brush tip based on only one tag. However, due to the ambient noises in real environments, it is inaccurate to compute the trajectory from a single tag. Instead, we take advantage of the tag array deployment of RF-Brush and use all the tags to produce a more precise trajectory. Since we have $n$ tags attached on the brush, we can compute $n$ estimates of $(x_B^{(t)}, y_B^{(t)})$ for the $t$-th snapshot. Then we can use linear regression to combine all the $n$ estimates to calculate the final 2D movement.

Fig. 10: Tracking the continuous movement of brush via linear tag array.

3) Calibrating the Abnormal Displacements: Similar to Section V-C3, the trajectory result is also affected by the ambient noises. Therefore, we further detect the abnormal displacements and calibrate them based on the moving speed. The basic idea is that the maximum writing speed of a human is usually limited by a constant speed $v_{max}$. Based on this maximum moving speed, we can flag a displacement as abnormal if its distance exceeds $\Delta t \times v_{max}$.
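The cumulative trajectory recovery of Eq. (16) and the multi-tag combination can be sketched as below. This is an illustrative Python sketch; a simple per-snapshot average stands in for the paper's linear regression, and the function names are our own.

```python
def recover_trajectory(displacements):
    """Eq. (16): accumulate the per-snapshot displacements (dx, dy) of
    the brush tip, starting from the origin (x_B^(0), y_B^(0)) = (0, 0)."""
    x, y, traj = 0.0, 0.0, [(0.0, 0.0)]
    for dx, dy in displacements:
        x, y = x + dx, y + dy
        traj.append((x, y))
    return traj

def fuse_tags(per_tag_positions):
    """Combine the n per-tag trajectory estimates into one trajectory.
    (A per-snapshot average stands in for the paper's linear regression.)"""
    n = len(per_tag_positions)
    return [(sum(p[t][0] for p in per_tag_positions) / n,
             sum(p[t][1] for p in per_tag_positions) / n)
            for t in range(len(per_tag_positions[0]))]
```

Here each element of `per_tag_positions` is the trajectory estimated from one tag of the array, so noise on any single tag is averaged out.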
After that, we can calibrate the displacement according to the average moving speeds $v_x^{(t-1)}$ and $v_y^{(t-1)}$ as $x_B^{(t)} = x_B^{(t-1)} + v_x^{(t-1)} \times \Delta t$ and $y_B^{(t)} = y_B^{(t-1)} + v_y^{(t-1)} \times \Delta t$. At last, we use $\langle x_B^{(t)}, y_B^{(t)} \rangle$ to replace the corresponding abnormal displacements in the trajectory calculation.

VI. PERFORMANCE EVALUATION

A. Experimental Setup

In order to validate the performance of RF-Brush, we conduct several experiments on 3D orientation estimation and 2D movement tracking in realistic settings. The experimental setup of RF-Brush consists of a linear tag array of 4 AZ-9629 RFID tags and an ImpinJ Speedway R420 RFID reader integrated with two Laird S9028PCL RFID antennas. As Figure 11 shows, the two antennas are placed 100cm away from the operating plane, at the same height as the operating plane. The size of the operating plane is about 25 × 25cm². The distance between two adjacent tags is 8cm, and the distance between the first tag and the brush tip is 2cm. An OptiTrack [19] device is also deployed to collect the ground-truth of the 3D orientation and 2D movement.

Metrics. In this paper, we use the elevation angle error and azimuthal angle error to evaluate the 3D orientation, and use the distance error to evaluate the 2D movement; both are the differences between the estimated values and the ground-truth from OptiTrack. Moreover, with regard to the handwritten letters, we also use LipiTk [20] to recognize the letters and use the accuracy rate to evaluate the performance.

Fig. 11: The experimental setup (RFID antenna pair, OptiTrack S250e camera, operating plane, and tag array).
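The evaluation metrics above can be sketched as follows. This is an illustrative Python sketch; the wrap-around handling of angle differences is our assumption, not stated in the paper.

```python
import math

def angle_error(est_deg, truth_deg):
    """Absolute angle difference between estimate and ground-truth,
    wrapped into [0, 180] degrees."""
    d = abs(est_deg - truth_deg) % 360.0
    return 360.0 - d if d > 180.0 else d

def distance_error(est_xy, truth_xy):
    """Euclidean distance between estimated and ground-truth positions."""
    return math.hypot(est_xy[0] - truth_xy[0], est_xy[1] - truth_xy[1])
```

These per-snapshot errors are then aggregated (e.g., averaged or plotted as a CDF) over all test cases.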
Fig. 12: Performance evaluation results. (a) The elevation angle error in stationary state. (b) The azimuthal angle error in stationary state. (c) The 2D distance error between two time points. (d) The recognition accuracy rate of letters. (e) The recognition accuracy rate of different users. (f) The elevation angle error in continuous movement. (g) The azimuthal angle error in continuous movement. (h) The 2D distance error in continuous movement.

B. Performance of 3D Orientation and 2D Movement in Discrete States

RF-Brush can accurately estimate the 3D orientation and 2D displacement in discrete states, with less than 10° angle error and less than 1cm distance error, respectively. To evaluate the elevation and azimuthal angle errors, we fixed the brush tip, rotated βe from 0° to 90° in steps of π/12, and rotated βa from 0 to 2π in steps of π/18. Hence, we tested 252 different orientations in total. Fig. 12(a) and 12(b) show the 3D orientation error for different brush orientations. Particularly, we separate the orientations into several groups and plot the corresponding errors, respectively. As Fig. 12(a) and 12(b) show, RF-Brush has an average angle error of about 6.4° and 7.8° for βe and βa, respectively. In particular, the different orientations have similar angle errors.
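The per-group error plots of Fig. 12(a)/(b) follow from a simple binning step, sketched below. The bin edges match the 0-30°, 30-60° and 60-90° elevation groups; the function name is illustrative.

```python
def group_mean_error(angles, errors, edges):
    """Mean error per orientation bin, e.g. edges=[0, 30, 60, 90] for
    the 0-30, 30-60 and 60-90 degree groups of Fig. 12(a)."""
    sums = [0.0] * (len(edges) - 1)
    counts = [0] * (len(edges) - 1)
    for a, e in zip(angles, errors):
        for i in range(len(edges) - 1):
            if edges[i] <= a < edges[i + 1]:
                sums[i] += e
                counts[i] += 1
                break
    # NaN marks an empty bin rather than silently reporting zero error
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]
```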
All the orientation errors are less than 10°, which proves the accuracy of RF-Brush. To evaluate the displacement error, we move the tagged brush from a start point to an end point without changing the brush orientation. Then, we calculate the displacement between the two discrete states of the brush. We randomly selected 300 samples with different displacement distances, i.e., 0∼3cm, 3∼6cm and 6∼9cm, for the performance evaluation. As Fig. 12(c) shows, the average distance errors are 0.19cm, 0.36cm and 0.54cm, respectively. The displacement calculation is highly accurate, because the brush orientation keeps unchanged and the user also keeps still.

C. 3D Motion Tracking of Letters

RF-Brush can accurately track and recognize handwritten letters with an average recognition accuracy of 89%. We involved 4 participants in total (2 males and 2 females) to write all the 26 letters 10 times each, with casual rotations in 3D orientation. We utilized the OptiTrack to collect the ground-truth of the 3D orientation and 2D movement in the operating plane during the writing process, and the letter trajectories were further recognized based on LipiTk.

As for the accuracy of letter recognition, we select the candidate with the largest confidence produced by LipiTk as our recognition result. Fig. 12(d) presents the recognition accuracy rate of each letter. RF-Brush achieves an average recognition accuracy rate of 89%. Particularly, the recognition accuracy rate exceeds 80% for almost all letters, except the letter "o". The reason is that "o" is easily recognized as "c" when the circle is not perfectly drawn. Moreover, 15 out of 26 letters achieve more than 90% accuracy, and 21 out of 26 letters are correctly recognized with an accuracy higher than 85%. It is worth noting that the letters "f", "k", "p" and "r" have a 100% accuracy rate due to their distinct shapes.

Additionally, we compare the recognition accuracy across different users to evaluate the robustness of RF-Brush.
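The candidate-based accuracy used in this evaluation can be sketched as a top-k check over the recognizer's confidence-ranked candidate lists. `topk_accuracy` is a hypothetical helper, not part of LipiTk.

```python
def topk_accuracy(ranked_candidates, truths, k=1):
    """Fraction of samples whose ground-truth letter appears among the
    k candidates with the largest confidence (each candidate list is
    assumed pre-sorted by confidence, best first)."""
    hits = sum(truth in cands[:k]
               for cands, truth in zip(ranked_candidates, truths))
    return hits / len(truths)
```

With k=1 this gives the single-candidate accuracy of Fig. 12(d); raising k to 2 reproduces the two-candidate comparison across users.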
Here, we use the two candidates with the largest confidence produced by LipiTk as the recognition result for comparison. As shown in Fig. 12(e), all the users achieve more than 87% accuracy for the first candidate, which validates the robustness of RF-Brush. The average accuracy increases to more than 95% for the first two candidates. Particularly, the highest accuracy rate, of user 4, exceeds 96% with the first two candidates, and the lowest accuracy rate, of user 2, is still about 93%. The differences in recognition accuracy are mainly due to the writing habits of each user. Overall, RF-Brush is sufficiently robust to recognize the letters of different users.

Further, we compare the trace and orientation calculated by RF-Brush with OptiTrack. We present the elevation and azimuthal angle errors between the estimated angles and the ground-truth in Fig. 12(f) and 12(g). We find that the average elevation angle error is 5.7°, and the average azimuthal angle error is 8.6°. The orientation error of real handwriting is comparable with the orientation error of the discrete states, because we can remove the
abnormal angles according to Section V-C3. Moreover, we use the Euclidean distance between the estimated trajectory and the ground-truth at each time point to examine the distance error, as shown in Fig. 12(h). The average distance error increases to 3.8cm and 4.2cm for the X-axis and Y-axis, respectively, which is larger than the error of the discrete states. We think it is mainly caused by the cumulative error during the estimation of the 2D movement. Overall, RF-Brush is still able to accurately provide the 3D orientation and 2D movement with small errors.

Finally, we present the overall 3D motion tracking performance of RF-Brush during real handwriting by leveraging the Cumulative Distribution Function (CDF). Fig. 13 plots the CDF of the orientation error and the 2D movement error, respectively. For the 3D orientation error, 80% of both the elevation and azimuthal errors are within 12°, while the elevation angle error is less than 9° for 80% of the test cases. For the 2D movement error, 85% of the distance errors along both the X-axis and Y-axis are less than 10cm, while the distance error along the X-axis is less than 10cm for 90% of the test cases. Therefore, RF-Brush can accurately track both the 3D orientation and the 2D movement of the brush during real handwriting.

Fig. 13: The CDF results. (a) The 3D orientation error (elevation vs. azimuthal angle). (b) The 2D movement error (X-axis vs. Y-axis).

VII. DISCUSSION

Off-plane Detection: For applications where the manipulation is sensitive to height, the 3D orientation and 2D movement information alone is not enough. We also need off-plane detection, which can further recognize the height change of the object, e.g., lifting the brush and putting it down.
The basic idea is that when we lift or put down the brush, the angle between the brush-antenna line and the operating plane increases or decreases accordingly. On the contrary, when the brush is moving on the operating plane, the angle is almost unchanged, due to the stable height of the brush. Such an angle can be easily calculated from the phase difference between any tag pair. Therefore, the off-plane detection can be realized based on the jump of the phase difference between the tag pairs of the tag array.

Extending Beyond Linear Shaped Objects: In our system, we can convert a linear shaped object into an intelligent device. To extend from linear shaped objects to an arbitrary polyhedron such as a cuboid, we can attach multiple linear tag arrays on two adjacent sides of the cuboid. Particularly, we first acquire the elevation and azimuthal angles of the two sides based on the phase difference, and use the orientations of the adjacent sides to determine the 3D orientation of the cuboid. Then, the 2D movement of the cuboid can be easily estimated from the phase variation, similar to the linear shaped object. Combining both the 3D orientation and 2D movement, we can recover the 3D motion of the cuboid for 3D human-computer interaction.

VIII. CONCLUSION

In this paper, we propose RF-Brush, a battery-free system to provide 3D human-computer interaction. Our key innovations lie in modeling the relationship between the RF-signal and the 3D orientation as well as the 2D movement of the tagged object. Based on the model, we implemented a prototype system of RF-Brush and examined its performance in a real environment. The experimental results confirm the effectiveness of RF-Brush on both 3D orientation and 2D movement, and RF-Brush achieves over 89% accuracy for handwriting recognition.

ACKNOWLEDGMENTS

This work is supported in part by the National Natural Science Foundation of China under Grant Nos. 61472185 and 61321491, and by the JiangSu Natural Science Foundation under Grant No. BK20151390.
This work is also partially supported by the Collaborative Innovation Center of Novel Software Technology and Industrialization, and by Program A for Outstanding PhD Candidates of Nanjing University. Lei Xie is the corresponding author.

REFERENCES

[1] S. Hinterstoisser, V. Lepetit, S. Ilic, P. Fua, and N. Navab, "Dominant orientation templates for real-time detection of texture-less objects," in Proc. of IEEE CVPR, 2010.
[2] W. Wan, F. Lu, Z. Wu, and K. Harada, "Teaching robots to do object assembly using multi-modal 3d vision," Neurocomputing, vol. 259, 2017.
[3] A. Collet, D. Berenson, S. S. Srinivasa, and D. Ferguson, "Object recognition and full pose registration from a single image for robotic manipulation," in Proc. of IEEE ICRA, 2009.
[4] Q. Pu, S. Gupta, S. Gollakota, and S. Patel, "Whole-home gesture recognition using wireless signals," in Proc. of ACM Mobicom, 2013.
[5] K. Ali, A. X. Liu, W. Wang, and M. Shahzad, "Keystroke recognition using wifi signals," in Proc. of ACM Mobicom, 2015.
[6] Nintendo Wii, https://www.nintendo.com/.
[7] J. Wang, D. Vasisht, and D. Katabi, "Rf-idraw: virtual touch screen in the air using rf signals," in Proc. of ACM SIGCOMM, 2014.
[8] L. Shangguan, Z. Zhou, and K. Jamieson, "Enabling gesture-based interactions with objects," in Proc. of ACM MobiSys, 2017.
[9] L. Shangguan, Z. Yang, A. X. Liu, Z. Zhou, and Y. Liu, "Relative localization of rfid tags using spatial-temporal phase profiling," in Proc. of NSDI, 2015.
[10] C. Duan, X. Rao, L. Yang, and Y. Liu, "Fusing rfid and computer vision for fine-grained object tracking," in Proc. of IEEE INFOCOM, 2017.
[11] W. Ruan, L. Yao, Q. Z. Sheng, N. J. Falkner, and X. Li, "Tagtrack: Device-free localization and tracking using passive rfid tags," in Proc. of ACM MOBIQUITOUS, 2014.
[12] J. Wang and D. Katabi, "Dude, where's my card?: Rfid positioning that works with multipath and non-line of sight," in Proc. of ACM SIGCOMM, 2013.
[13] L. Yang, Y. Chen, X.-Y. Li, C. Xiao, M. Li, and Y. Liu, "Tagoram: Real-time tracking of mobile rfid tags to high precision using cots devices," in Proc. of ACM Mobicom, 2014.
[14] R. Krigslund, P. Popovski, G. F. Pedersen, and K. Bank, "Potential of rfid systems to detect object orientation," in Proc. of IEEE ICC, 2011.
[15] J. Liu, M. Chen, S. Chen, Q. Pan, and L. Chen, "Tag-compass: Determining the spatial direction of an object with small dimensions," in Proc. of IEEE INFOCOM, 2017.
[16] T. Wei and X. Zhang, "Gyro in the air: tracking 3d orientation of batteryless internet-of-things," in Proc. of ACM Mobicom, 2016.
[17] Q. Lin, L. Yang, Y. Sun, T. Liu, X.-Y. Li, and Y. Liu, "Beyond one-dollar mouse: A battery-free device for 3d human-computer interaction via rfid tags," in Proc. of IEEE INFOCOM, 2015.
[18] L. Shangguan, Z. Li, Z. Yang, M. Li, and Y. Liu, "Otrack: Order tracking for luggage in mobile rfid systems," in Proc. of IEEE INFOCOM, 2013.
[19] OptiTrack, http://www.optitrack.com/.
[20] LipiTK, http://lipitk.sourceforge.net/.