Fig. 9. Analysis of activity features: (a) Walking; (b) Lifting up the phone; (c) Rotating the phone.

example, if the user has laid down the arm, he/she will probably not lay down the arm again. For body level, whether the user keeps still or keeps moving (e.g., walking, jogging), he/she stays in the same state.
Therefore, there are no self-cycle activities in body level. For arm level and wrist level, the short pause between two activities is used for segmenting two consecutive activities, as described in Section 4.1.3. The transfer relationship of states is shown in Figure 8. By maintaining the activity state machine, we can determine the activity state progressively and reduce the error of activity recognition. For example, we may wrongly recognize the activity “Rotating the phone” as “Fine-tuning”. Nevertheless, since the two states have similar energy-saving strategies, there will not be a sudden decrease in user experience. Usually, we will not recognize “Rotating the phone” as “Walking,” which turns off the screen for energy saving and leads to a bad user experience. Therefore, the activity state machine can control the recognition error within a tolerable range and guarantee a good user experience.

For different activities, the amplitude and direction of linear-acc change differently, as shown in Figure 9. For body level, the linear-acc changes periodically, as shown in Figure 9(a). For arm level, the user usually lifts his/her arm one time, as shown in Figure 9(b). However, considering different holding gestures of the phone, the direction variation of linear-acc cannot be mapped to the activities of lifting up or laying down the arm directly. Thus, we introduce the gravity data, which reflects the direction of gravity, to assist activity recognition. In regard to the wrist level, the user tends to rotate the phone to adjust the camera view, as shown in Figure 9(c). Therefore, we combine the linear-acc and gyroscope data to recognize the activities.

With the activity segment from Section 4.1.3, we show how to classify an activity into one of the three levels in Figure 10. We utilize the variance of the linear-acc and gyroscope data and the periodicity of the sensor data to distinguish body level, arm level, and wrist level. For body level, if the user stays motionless, the values of the linear-acc and gyroscope data are close to zero. The rectangle B in Figure 6 and the rectangle C1 in Figure 7 represent the linear-acc and gyroscope data of a pause (or motionlessness), respectively. If the sensor data in a relatively long time window (e.g., longer than 15 s) satisfy Equations (3) and (4), then the activity will be recognized as motionlessness, that is, classified into body level. In regard to body movement, the sensor data do not satisfy Equations (3) and (4). However, we can utilize the periodicity to distinguish body movement from other activities, as shown by the sensor data of “Walking” in Figure 6. Here, we simply use the period $t_p$ of the activity for body movement detection. We use $t_{p_1}, t_{p_2}, \ldots, t_{p_{k-1}}, t_{p_k}$ to represent the periods of the last $k$ activities in body movement, respectively. If $t_p$ satisfies Equation (5), then the activity will be classified into body movement, that is, body level. Based on extensive experiments, we set $\epsilon_p$ to 0.2:

$$\frac{\left| t_p - \frac{1}{k}\sum_{j=1}^{k} t_{p_j} \right|}{\frac{1}{k}\sum_{j=1}^{k} t_{p_j}} < \epsilon_p. \qquad (5)$$
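To make the level-classification step concrete, the following minimal Python sketch shows the two checks described above: a variance-based motionlessness test and the periodicity test of Equation (5). Equations (3) and (4) are not reproduced in this excerpt, so the variance thresholds and window handling below are illustrative assumptions rather than the paper's actual parameters; only $\epsilon_p = 0.2$ comes from the text.

```python
import numpy as np

# Assumed placeholder thresholds: Equations (3) and (4) are not given in this
# excerpt, so these variance limits are illustrative, not the paper's values.
LINEAR_ACC_VAR_MAX = 0.05   # (m/s^2)^2, "close to zero" for motionlessness
GYRO_VAR_MAX = 0.05         # (rad/s)^2, "close to zero" for motionlessness
EPSILON_P = 0.2             # relative period deviation threshold, Equation (5)

def is_motionless(linear_acc, gyro):
    """Body level, motionless: linear-acc and gyroscope variance close to zero
    over a relatively long window (e.g., more than 15 s of samples).
    linear_acc, gyro: arrays of shape (N, 3)."""
    return (np.var(linear_acc, axis=0).max() < LINEAR_ACC_VAR_MAX and
            np.var(gyro, axis=0).max() < GYRO_VAR_MAX)

def is_body_movement(t_p, recent_periods):
    """Body level, movement: Equation (5) -- the current period t_p stays
    within epsilon_p relative deviation of the mean of the last k periods."""
    mean_period = np.mean(recent_periods)
    return abs(t_p - mean_period) / mean_period < EPSILON_P
```

For example, if the last $k$ walking periods average 1.0 s, a new period of 1.1 s passes the Equation (5) check (relative deviation 0.1 < 0.2), while a period of 1.5 s does not and would not be classified as body movement.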
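Returning to the activity state machine, the sketch below illustrates the transition-constraint idea. The actual transfer relationships are defined in Figure 8, which is not visible in this excerpt, so the state names and allowed transitions here are illustrative assumptions; the point is only that a recognized activity is accepted when it is a legal successor of the current state, which is what prevents implausible misrecognitions such as going directly from “Rotating the phone” to “Walking.”

```python
# Illustrative transition table; the real one comes from Figure 8 of the paper.
ALLOWED_TRANSITIONS = {
    "Walking":               {"Motionless", "Lifting up the phone"},
    "Motionless":            {"Walking", "Lifting up the phone"},
    "Lifting up the phone":  {"Rotating the phone", "Fine-tuning", "Laying down the arm"},
    "Rotating the phone":    {"Fine-tuning", "Laying down the arm"},
    "Fine-tuning":           {"Rotating the phone", "Laying down the arm"},
    "Laying down the arm":   {"Walking", "Motionless"},
}

def next_state(current, recognized):
    """Accept the recognized activity only if the transition is allowed from
    the current state; otherwise keep the current state, so a single
    misrecognition cannot jump to an implausible activity."""
    if recognized in ALLOWED_TRANSITIONS.get(current, set()):
        return recognized
    return current
```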