
I am the UAV: A Wearable Approach for Manipulation of Unmanned Aerial Vehicle

Yijia Lu†, Fei Han†, Lei Xie†, Yafeng Yin†, Congcong Shi†‡ and Sanglu Lu†
†State Key Laboratory for Novel Software Technology, Nanjing University, China
‡Global Energy Interconnection Research Institute
Email: lyj@smail.nju.edu.cn, hfei@dislab.nju.edu.cn, lxie@nju.edu.cn, yyf@dislab.nju.edu.cn, shicongcong@geiri.sgcc.com.cn, sanglu@nju.edu.cn

Abstract—Nowadays, Unmanned Aerial Vehicles (UAVs) are widely applied in our daily life. However, the existing approach to interacting with UAVs, i.e., using a remote controller with control sticks, is neither natural nor intuitive. In this paper, we present a novel approach that lets users interact with personal UAVs through wearable devices. The basic idea of our approach is to manipulate UAVs based on human activity sensing, including motion recognition and pedestrian dead-reckoning. We have implemented the proposed approach on a DJI drone and evaluated its performance in a real-world environment. Realistic experiment results show that our solution can replace the remote controller for manipulating the UAV.

I. INTRODUCTION

Unmanned Aerial Vehicles (UAVs), commonly known as drones, are becoming increasingly popular and helpful in our daily life. Nowadays, UAVs are widely used in numerous fields, such as package delivery, crop dusting, film shooting, and search and rescue. Nevertheless, to the best of our knowledge, most commercial UAVs are manipulated with remote controllers. For example, one control stick of the remote controller controls the UAV's up-and-down movements and rotations, while the other changes the UAV's forward, backward, left, and right pitch. This kind of manipulation may not be intuitive, and can even be difficult for unprofessional users to learn.

Under these circumstances, Human-Drone Interaction (HDI) has attracted much attention. State-of-the-art solutions for human-drone interaction mainly exploit machine-vision techniques. Nagi et al. present a method that uses onboard video cameras for humans to interact with UAVs, with the help of face pose estimates and hand gestures [1]. Pfeil et al. explore upper-body 3D spatial interaction metaphors for control of and communication with UAVs, using the Microsoft Kinect [2]. FollowMe leverages a quadrocopter to follow a person and recognize simple gestures using an onboard depth camera [3]. Unfortunately, machine-vision-based interaction is usually sensitive to light intensity and the surrounding environment, which can seriously degrade its performance.

The good news is that the development of wearable devices, such as smartwatches and smartglasses, provides new opportunities to interact with UAVs conveniently and intuitively. In this paper, we propose a novel approach for human-drone interaction based on sensing the user's activity. As shown in Fig. 1, with the help of wearable devices such as a smartwatch and a smartglass, we are able to accurately perceive the user's arm movements and body movements, including the moving direction and distance, by leveraging accelerometers and gyroscopes. We use a smartphone as the server, which combines and optimizes the processed data from the smartwatch and the smartglass, and then sends commands to the remote controller wirelessly to manipulate the UAV.

Fig. 1. Diagram of the proposed approach.

There are several challenges in designing a scheme for manipulation of UAVs based on wearable devices.
The first challenge is that human-drone interaction requires higher accuracy and lower latency than other human-computer interactions, because a recognition error or an intolerable delay may lead to loss of control of, or even a crash of, the UAV. To address this challenge, we have designed simple and discriminable motions for manipulating UAVs based on motion recognition. What's more, we have reduced the number of motion templates to cut the time cost of matching. The second challenge is how to filter out unintentional motions of users. To address this challenge, we make full use of the consistency of the inertial data on the two wearable devices while walking; in other words, a manipulation takes effect only when the smartglass and the smartwatch detect similar movements.

We make the following main contributions in this demo paper: 1) We propose a novel approach for manipulating UAVs with wearable devices. 2) We design a complete human-drone interaction solution that can take the place of a remote controller in most cases. 3) We implement the proposed approach on a DJI drone and two wearable devices, namely a Google Glass and a Moto 360 smartwatch.


II. SYSTEM DESIGN

A. System Overview

In our approach, we focus on how to manipulate UAVs based on the inertial sensors embedded in the wearable devices, specifically one smartwatch and one smartglass. Fig. 2 shows the framework of our manipulation system.

Fig. 2. Framework of the proposed approach.

The components of our system are as follows:

1) Smartglass: The smartglass senses the movement of the user's head via an accelerometer and a gyroscope. On one hand, it detects the user's steps and turns while walking. On the other hand, it detects the variation of the user's horizon. The results are sent to the smartphone through WiFi.

2) Smartwatch: The smartwatch senses the periodic movement of the user's arm and detects steps and turns while walking, based on the inertial sensor readings. Besides, it senses the user's appointed motions by matching the current motion against trained templates. The results are sent to the smartphone via Bluetooth.

3) Smartphone: The smartphone works as a server, linking the two wearable devices and the UAV. It receives the recognition results from the smartglass and the smartwatch, and then determines whether to generate a flight command according to the consistency of the recognition results.

4) UAV: The UAV receives commands from the smartphone, and then follows the commands to pitch, roll, yaw, rise, descend, etc.

B. Motion Recognition

We design four motions to control the flight of the UAV, covering taking off/rising, descending, stopping rising or descending, and taking photos. Table I shows the specific motions and the corresponding flight commands. The motions we designed are natural and simple, especially the motions for rising and descending, which make the user feel as if flying with wings.

TABLE I
MOTIONS AND CORRESPONDING COMMANDS

Motion                                      Command
Lateral raise twice quickly                 Take off / Rise (if already taken off)
Lateral raise once slowly                   Descend
Lift the arm to the chest                   Stop rising (if rising) / Stop descending (if descending)
Swing the arm forward and backward twice    Take photo

When we perform the same motion twice, the amount of inertial data in each motion period is different. However, the variation trend of the data is consistent. Fig. 3 displays the variation of acceleration on one axis when performing the "rise" motion twice: the two curves have different lengths but share the same variation trend. Therefore, we cannot directly calculate the distance between the received sensor data and the trained motion templates. Motion recognition methods based on machine learning typically use training data to generate a classifier, which leads to a high training cost. Fortunately, Dynamic Time Warping (DTW) solves this problem well. In time series analysis, DTW is an algorithm for measuring the similarity between two temporal sequences that vary in speed. The DTW algorithm uses dynamic programming to find the optimal match between two temporal sequences by calculating the distance between them.

Fig. 3. Different sequences of the same motion vary in speed.

As shown in Table I, there are four motions in our system. To use DTW, we first establish the template of each motion. Each motion correlates with six kinds of sensor data, i.e., the 3-axis acceleration readings a_x, a_y, a_z and the 3-axis angular velocity readings w_x, w_y, w_z. Consequently, the template of each motion consists of these six kinds of sensor data.
For simplicity, we only use one template for each motion. For the received sensor data of a potential motion M_i, we calculate the DTW distance between M_i and each motion template T_j, j ∈ [1, 4], as shown in Eq. (1). Here, d_ax, d_ay, d_az, d_wx, d_wy, d_wz denote the DTW distances on the x-, y-, and z-axes of the acceleration and gyroscope data, respectively.

D_DTW = d_ax + d_ay + d_az + d_wx + d_wy + d_wz    (1)

After calculating the DTW distance between the potential motion M_i and each template T_j, j ∈ [1, 4], we obtain four DTW distances D_DTW1, D_DTW2, D_DTW3, D_DTW4. We then compare the four results and select the motion with the smallest DTW distance as the recognition result of the potential motion M_i.
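As a concrete illustration of Eq. (1) and the matching rule it feeds, the following Python sketch computes per-axis DTW distances with the standard dynamic-programming recurrence and picks the closest template. The function names, the template dictionary, and the (T, 6) array layout of (a_x, a_y, a_z, w_x, w_y, w_z) samples are our own illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def dtw_distance(s, t):
    """DTW distance between two 1-D sequences via dynamic programming."""
    n, m = len(s), len(t)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            # An optimal alignment may match, insert, or delete a sample.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def recognize(motion, templates):
    """motion: (T, 6) array of (a_x, a_y, a_z, w_x, w_y, w_z) samples.
    templates: dict mapping a motion name to its (T', 6) template array.
    Returns the name whose summed per-axis DTW distance (Eq. 1) is smallest."""
    def total_distance(tpl):
        return sum(dtw_distance(motion[:, k], tpl[:, k]) for k in range(6))
    return min(templates, key=lambda name: total_distance(templates[name]))
```

With one recorded template per motion, recognize(motion, templates) returns the label with the minimum summed distance, matching the decision rule above; keeping only four templates keeps this O(n·m) matching cheap.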

C. Turn Detection

Although information about the user's body turns could be extracted from a magnetometer, most wearable devices have no magnetometer, so we detect turns using the gyroscope data in our approach. Turn detection is based on the fact that the rotation axis of the human body during a turn is always along the direction of gravity. However, the gyroscope measures the angular velocities of rotation on each axis of the device's body frame, so it is necessary to transform the angular velocity from the body frame to the earth frame. We obtain the turn angle by integrating the angular velocity around the gravitational direction on the smartglass and the smartwatch, respectively.
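A minimal sketch of this projection-and-integration step, under the simplifying assumption that the gravity direction in the body frame can be approximated by the normalized accelerometer reading (reasonable for a quasi-static turn without strong linear acceleration); the function name and the 50 Hz-derived sample period are illustrative, not from the paper.

```python
import numpy as np

def turn_angle_deg(acc, gyro, dt=1.0 / 50):
    """Accumulate rotation about the gravity axis.
    acc, gyro: (T, 3) arrays of body-frame accelerometer (m/s^2) and
    gyroscope (rad/s) samples; dt: sample period (50 Hz in the paper)."""
    angle = 0.0
    for a, w in zip(acc, gyro):
        # Approximate the gravity direction in the body frame; this treats
        # the accelerometer reading as mostly gravity during the turn.
        g = a / (np.linalg.norm(a) + 1e-9)
        # Project the angular velocity onto the gravity axis and integrate.
        angle += float(np.dot(w, g)) * dt
    return np.degrees(angle)
```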
D. Step Detection

First, we use a low-pass filter to remove high-frequency noise from the inertial data. Then we use a sliding window to detect steps. For the smartglass, the acceleration along the gravitational direction has obvious peaks, and each peak represents one step, as shown in Fig. 4(a). For the smartwatch, the device rotates around the shoulder joint as the arm swings forward and backward, so both the peaks and the valleys in the data each represent one step, as shown in Fig. 4(b).

Fig. 4. Step detection: (a) Smartglass; (b) Smartwatch.
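One possible realization of this pipeline, assuming SciPy is available; the Butterworth order, the 3 Hz cutoff, the prominence value, and the minimum step interval are illustrative parameter choices rather than values given in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def count_steps(acc_g, fs=50, cutoff_hz=3.0, min_step_s=0.3, on_watch=False):
    """Count steps in the acceleration component along gravity.
    acc_g: 1-D array sampled at fs Hz; on_watch selects the smartwatch rule."""
    # Low-pass filter to remove high-frequency noise.
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    smooth = filtfilt(b, a, np.asarray(acc_g, dtype=float))
    gap = int(min_step_s * fs)  # enforce a minimum interval between steps
    peaks, _ = find_peaks(smooth, distance=gap, prominence=1.0)
    if not on_watch:
        return len(peaks)  # smartglass: each peak is one step
    # Smartwatch: the arm swing makes both peaks and valleys mark steps.
    valleys, _ = find_peaks(-smooth, distance=gap, prominence=1.0)
    return len(peaks) + len(valleys)
```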

axis of the device’s body-frame. So it’s necessary to transform the angular velocity from body-frame to earth-frame. We get the turn angle by calculating the integral of angular velocity around the gravitational direction on the smartglass and the smartwatch, respectively. D. Step Detection First, we use a low-pass filter to remove high-frequency noise in the inertial data. Then we utilize a sliding window to detect steps. For the smartglass, the acceleration along the gravitational direction has obvious peaks and each peak represents one step, as shown in Fig.4(a). While for the smartwatch, it rotates around the shoulder joint when the arm swings forward and backward. So both of the peak and the valley in data represent one step, as shown in Fig.4(b). (a) Smartglass (b) Smartwatch Fig. 4. Step detection. E. Consistency Judgement The two wearable devices detect user’s turns and steps seperately, and then send the result to the server immediately. Consequently, it is important to judge consistency in our approach. We have two metrics for consistency judgement: 1) Magnitude. The angle of the turn received from two wear￾able devices must be similar in magnitude. If the difference between the two magnitudes is larger than the threshold, we consider they are not consistent. 2) Timestamp. When the server receives the result from the wearable device, it will record the magnitude as well as the timastamp. In addition to magnitude, the difference in timestamp should also be under a threshold. III. IMPLEMENTATION AND EVALUATION We have implemented the proposed approach using com￾mercial devices[4]. The UAV is DJI Phantom3 Professional, which allows developers to use the mobile SDK to create a customized mobile app. The mobile SDK provides interfaces to flight and camera control of the UAV[5]. The smart devices are Google glass and Moto360 smartwatch, both running on Android platform. We use a Huawei MT7-CL00 smartphone running Android 6.0 as the system server. The wearable devices’ sample frequency is set as 50Hz. A. Accuracy of motion recognition Fig.5 plots the confusion matrix for four motions. Each row represents the actual motions performed by the user and each column represents the recognized motion. Each element in the matrix corresponds to the probability of the motion in the row that is recognized as the motion in the column. 0.98 0.00 0.00 0.00 0.02 1.00 0.00 0.00 0.00 0.00 1.00 0.00 0.00 0.00 0.00 1.00 Rise Descend Stop Take photo Rise Descend Stop Take photo Fig. 5. Confusion matrix for four motions. B. Accuracy of turn and step detection Experiment results show that the accuracy of step detection of our approach is above 92%. As for the accuracy of turn detecion, the average error of calculated turning angles is within 5∘. The error of the UAV’s turning angles is caused by two factors. First, when we send a specific value of angle to the remote controller of the UAV, the UAV will not turn the actual value we give. Second, there exists some deviation between the actual turning angle and the calculated turning angle, which is derived by integrating the angular velocities. IV. CONCLUSION In this paper, we present a novel wearable approach for con￾venient manipulation of UAVs. We use two common wearable devices embedded with inertial sensors to estimate the moving distance and the direction of the user. Besides, we sense the user’s arm motions with an accelerometer and a gyroscope based on template matching. 
III. IMPLEMENTATION AND EVALUATION

We have implemented the proposed approach using commercial devices [4]. The UAV is a DJI Phantom 3 Professional, which allows developers to use the Mobile SDK to create a customized mobile app. The Mobile SDK provides interfaces to the flight and camera control of the UAV [5]. The smart devices are a Google Glass and a Moto 360 smartwatch, both running on the Android platform. We use a Huawei MT7-CL00 smartphone running Android 6.0 as the system server. The wearable devices' sampling frequency is set to 50 Hz.

A. Accuracy of Motion Recognition

Fig. 5 plots the confusion matrix for the four motions. Each row represents the actual motion performed by the user and each column represents the recognized motion; each element in the matrix is the probability that the motion in the row is recognized as the motion in the column.

              Rise    Descend    Stop    Take photo
Rise          0.98    0.02       0.00    0.00
Descend       0.00    1.00       0.00    0.00
Stop          0.00    0.00       1.00    0.00
Take photo    0.00    0.00       0.00    1.00

Fig. 5. Confusion matrix for four motions.

B. Accuracy of Turn and Step Detection

Experiment results show that the accuracy of step detection in our approach is above 92%. As for the accuracy of turn detection, the average error of the calculated turning angles is within 5°. The error in the UAV's turning angle is caused by two factors. First, when we send a specific angle value to the remote controller of the UAV, the UAV does not turn by exactly the value we give. Second, there is some deviation between the actual turning angle and the calculated turning angle, which is derived by integrating the angular velocities.

IV. CONCLUSION

In this paper, we present a novel wearable approach for convenient manipulation of UAVs. We use two common wearable devices embedded with inertial sensors to estimate the moving distance and direction of the user. Besides, we sense the user's arm motions with an accelerometer and a gyroscope based on template matching. Realistic experiments show that our approach is able to manipulate the UAV successfully.

ACKNOWLEDGMENT

This work is supported in part by the National Natural Science Foundation of China under Grant Nos. 61472185, 61373129, 61321491, and 61502224, and by the Jiangsu Natural Science Foundation under Grant No. BK20151390. This work is partially supported by the Collaborative Innovation Center of Novel Software Technology and Industrialization, and by the 2016 Program A for Outstanding PhD Candidates of Nanjing University under Grant No. 201601A008. Lei Xie is the corresponding author.

REFERENCES

[1] J. Nagi et al., "Human Control of UAVs using Face Pose Estimates and Hand Gestures," ACM/IEEE International Conference on Human-Robot Interaction, 2014.
[2] K. Pfeil, L. K. Seng, and J. LaViola, "Exploring 3D gesture metaphors for interaction with unmanned aerial vehicles," International Conference on Intelligent User Interfaces, 2013.
[3] T. Naseer, J. Sturm, and D. Cremers, "FollowMe: Person following and gesture recognition with a quadrocopter," IEEE/RSJ International Conference on Intelligent Robots and Systems, 2013.
[4] Demo of "I am the UAV," http://cs.nju.edu.cn/lxie/demo/uav.mp4.
[5] DJI Developer, https://developer.dji.com/mobile-sdk/.
