Proceedings of the 2015 IEEE International Conference on Information and Automation, Lijiang, China, August 2015

General Simulation Platform for Vision Based UAV Testing

Qing Bu, Fuhua Wan, Zhen Xie, Qinhu Ren, Jianhua Zhang*, Sheng Liu
College of Computer Science, Zhejiang University of Technology, Hangzhou, Zhejiang Province, China
* [email protected]

Abstract - The main purpose of this article is to build a general platform for Unmanned Aerial Vehicle (UAV) simulation, especially for vision algorithms. This general simulation platform can also connect to realistic UAVs for further tests. The platform enables UAVs controlled by a variety of advanced vision algorithms to accomplish certain tasks. Unlike previous simulation systems, which can only achieve tasks with specific drones, we are committed to building a platform that supports simulation of a variety of vision based UAVs. If a simulation test succeeds, the same vision algorithms can be applied to tests on realistic UAVs with little adaptation on our platform. We use the open source Gazebo simulator combined with ROS (Robot Operating System) to implement UAV simulation. Besides, we build a general UAV link interface to connect realistic UAVs with ROS. Together, these components constitute the general simulation platform.

Keywords - UAV, ROS, General Simulation Platform

I. INTRODUCTION
UAVs have been widely used in daily life and research in recent years. More and more researchers are showing a growing interest in vision based UAV control algorithms, e.g. [1], [2] and [3]. The performance of these algorithms is impressive, such as the AR.Drone vision based object tracking proposed in [4] and the AR.Drone PTAM-based mapping proposed in [5]. However, these works only run their vision algorithms on the AR.Drone in real environments. When testing prototype vision algorithms for UAV control, UAVs may easily hurt people nearby or be damaged. Because of the complexity and danger of realistic UAV testing, we need to run simulations before realistic tests. Fortunately, there are already some UAV simulation platforms, such as [6], [7]. One is TUM's AR.Drone simulator. It supports the AR.Drone as the only UAV for simulation and can connect to a realistic AR.Drone for tests with a little adaptation, but if we want to use other UAVs for tests, this simulator is no longer suitable. Some other simulation platforms provide a simulation environment but cannot be used to implement realistic UAV tests, such as [8]. As open source UAV autopilots offer great flexibility and versatility, many researchers use open source autopilots such as Pixhawk (started by Lorenz Meier at ETH) [9]. It is necessary to develop a simulation platform that can support open source autopilots for realistic tests. This motivates us to present a general simulation platform for vision based UAV simulation. The platform can be used for a variety of vision based UAV simulations and realistic tests. It is based on the simulator Gazebo [10] and ROS [11], both of which are standard tools in robotics, contributed to by researchers all over the world. The main contribution of this work is to build a general simulation platform based on Gazebo and ROS, on which a variety of vision algorithms can be implemented to control UAVs. Specifically, our platform supports open source autopilots, the AR.Drone, and the Pelican (a famous product of Ascending Technologies). We have already run simulation and realistic tests for the ORB-SLAM and OpenTLD tracking algorithms with the AR.Drone on our simulation platform, and we have connected a quadrotor with a Pixhawk autopilot to the platform, where it can be controlled by joysticks and algorithms.


II. GENERAL SIMULATION PLATFORM

A. Platform Framework
Our general platform framework combines ROS, Gazebo, vision algorithms, and realistic UAVs. ROS is a software package that transports sensor and control data via "topics" using the publisher/subscriber message passing model. Gazebo is a multi-robot simulator for outdoor environments. Like Stage [12], it is capable of simulating a population of robots, sensors and objects, but it does so in a three-dimensional world, generating both realistic sensor feedback and physically plausible interactions between objects. Our general framework is shown in Figure 1; it is divided into two parts, a simulation layer and a realistic layer, which we introduce in turn below.
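As a minimal illustration of this topic model (a sketch only; the node and topic names here are purely illustrative, not part of our platform), a node can publish on a topic while any number of nodes subscribe to it:

#!/usr/bin/env python
# Minimal illustration of ROS publisher/subscriber message passing.
# The topic name '/uav/height' is hypothetical.
import rospy
from std_msgs.msg import Float32

def on_height(msg):
    # Every node subscribed to the topic receives each published message.
    rospy.loginfo('height: %.2f m', msg.data)

rospy.init_node('topic_demo')
pub = rospy.Publisher('/uav/height', Float32, queue_size=1)
rospy.Subscriber('/uav/height', Float32, on_height)

rate = rospy.Rate(10)  # publish at 10 Hz
while not rospy.is_shutdown():
    pub.publish(Float32(data=1.5))
    rate.sleep()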

Fig.1 The framework of our simulation system.

B. Design of Simulation Layer
Gazebo provides a multi-robot simulation environment including dynamics simulation, which is provided by the ODE or Bullet physics engines. The simulator handles gravity, contact forces and friction on its own, but it does not cover the aerodynamics and propulsion systems that aerial vehicles require. Its plugin system enables users to add custom controllers for simulated robots and sensors, or to influence the environment. Our work is therefore to create the appearance of the models and the sensor/controller simulation plugins for different UAVs.

Geometry
The robot geometry was modeled with the open source software Blender. To provide different colors (both texture and material based) for the model, the visual geometry is supplied in the COLLADA format, while the collision geometry is modeled as a ".stl" mesh. The model is designed to have a low polygon count while still retaining the relevant aspects of the UAV's geometry.

Dynamics Model
The position and movement of the UAV are described by the total force F and the total torque (written τ below, to avoid a clash with the mass M) acting on the UAV. Here x and v are the position and velocity of the UAV's center of gravity in the inertial coordinate system, θ and ω are the attitude angles and angular rates of the UAV in the body coordinate system, and T is the rotation matrix that transforms a vector from the body to the inertial coordinate system. The mass M and inertia J of the UAV are determined by the weights and geometries of the model components. One of the main advantages of the quadrotor concept is the simplicity of its propulsion and steering system, consisting only of four independent motors and propellers with fixed pitch, where each pair of opposite propellers rotates in one direction to avoid yaw torque during roll and pitch movements. As a result, the overall system dynamics is mainly determined by the thrust and torque induced by the individual motor/propeller units.
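The governing equations were not reproduced above; as a sketch only, a standard Newton-Euler rigid-body formulation consistent with these definitions (and not necessarily the exact model implemented by the Gazebo plugins) is

\begin{aligned}
\dot{x} &= v, & M\,\dot{v} &= T\,F + M\,g, \\
\dot{\theta} &= \Phi(\theta)\,\omega, & J\,\dot{\omega} &= \tau - \omega \times J\,\omega,
\end{aligned}

where F and τ are the total force and torque produced by the four motor/propeller units in body coordinates, g is the gravity vector, and Φ(θ) maps body angular rates to Euler angle rates.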

Sensor Simulation
As attitude, position and velocity cannot be measured directly, accurate models are needed to simulate the signals of the various sensors used to estimate the state of the UAV. These sensors are implemented as independent Gazebo plugins and can be attached to the model by including them in the robot's URDF description. The plugins accept parameters covering the error characteristics, and the altitude and orientation of the Gazebo reference frame in the world coordinate system where necessary.

C. Design of Realistic Test Frame
After verifying the feasibility of an algorithm on the simulation platform described above, we test it on a realistic UAV. For realistic tests, supporting popular UAVs is the core idea of our general platform. The UAVs most used by researchers are open source autopilots with a variety of frame structures, such as Pixhawk (started by Lorenz Meier at ETH) [9], and commercial UAVs such as the AR.Drone (a famous product of Parrot) and the Pelican (a product of Ascending Technologies). We build our own general UAV link interface for all of these UAVs. In this interface we integrate mavros, the AR.Drone driver and the AscTec driver as the bridge between the UAVs and ROS. mavros is the communication bridge between the MAVLink [13] protocol, which is widely used by many popular open source autopilots, and ROS topics. This means our general simulation platform is also compatible with other popular open source autopilots, for example APM (ArduPilot Mega). With an open source autopilot, we can easily use a variety of frame structures for different applications. Both popular commercial UAVs and open source UAVs can be used for realistic vision algorithm tests on our general simulation platform.
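As an illustration of how such a link interface can present one API over different drivers, the following sketch dispatches velocity commands to either platform. The UAVLink class and the topic mapping are hypothetical; only the underlying ardrone_autonomy and mavros topic conventions are standard:

#!/usr/bin/env python
# Sketch of a general UAV link interface dispatching to different ROS
# drivers. The class is hypothetical; the topics follow the
# ardrone_autonomy and mavros conventions.
import rospy
from geometry_msgs.msg import Twist

# Velocity-command topic per supported autopilot/driver.
CMD_TOPICS = {
    'ardrone': '/cmd_vel',                                      # ardrone_autonomy
    'pixhawk': '/mavros/setpoint_velocity/cmd_vel_unstamped',   # mavros
}

class UAVLink(object):
    """Uniform velocity interface over heterogeneous UAV drivers."""
    def __init__(self, uav_type):
        self._pub = rospy.Publisher(CMD_TOPICS[uav_type], Twist, queue_size=1)

    def send_velocity(self, vx, vy, vz, yaw_rate=0.0):
        cmd = Twist()
        cmd.linear.x, cmd.linear.y, cmd.linear.z = vx, vy, vz
        cmd.angular.z = yaw_rate
        self._pub.publish(cmd)

# Usage: the same vision controller can drive either platform, e.g.
# rospy.init_node('uav_link'); UAVLink('ardrone').send_velocity(0.2, 0, 0)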




III. EXPERIMENTS

A. Simulation
We tested several vision algorithms on UAVs in the simulation platform, selecting two advanced vision algorithms. The first, ORB-SLAM [14] [15], is a monocular SLAM system that operates in real time, in small and large, indoor and outdoor environments, with the capability of wide baseline loop closing and relocalization, including fully automatic initialization. The second, OpenTLD [16], can robustly track objects in the drone's video stream without any previous knowledge of the target. We plugged these two algorithms into our simulation platform, as the computer vision algorithm module in Figure 1, and customized the controller. We selected the AR.Drone quadrotor as the test UAV in the simulator.
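A minimal sketch of this coupling follows: images in, velocity commands out. The topic names match the ardrone_autonomy conventions that the simulated AR.Drone mirrors, while run_tracker() is only a placeholder for the actual algorithm (ORB-SLAM or OpenTLD in our experiments):

#!/usr/bin/env python
# Sketch of the vision-algorithm module in Figure 1. run_tracker() is a
# stand-in for the real vision algorithm.
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist

bridge = CvBridge()
cmd_pub = None

def run_tracker(frame):
    # Placeholder: a real algorithm would compute a command from the frame.
    return Twist()  # zero command = hover in place

def on_image(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    cmd_pub.publish(run_tracker(frame))

rospy.init_node('vision_controller')
cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rospy.Subscriber('/ardrone/front/image_raw', Image, on_image)
rospy.spin()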

UAV Mapping Simulation
We ran a quadrotor-based simulation test in the ROS + Gazebo framework, combining the quadrotor with the relatively novel SLAM algorithm ORB-SLAM. With this algorithm the quadrotor can localize itself in real time, and when a loop closure is detected, the previously estimated path is optimized automatically. We therefore placed objects in the scene to facilitate the detection of ORB features, as shown in Figure 2, and used a PS3 controller to fly the model aircraft around the two buildings on the left a few times.

Then, using the images from the quadrotor's front camera as input, new feature points are stored in a bag of words, so the next time the quadrotor flies to the same position it updates its location on the map by itself, as shown in figures 3 and 4.

Fig.2 The whole scene we designed for the quadrotor simulation.

Fig.3 The current scene while the quadrotor flies at its current position.

Fig.4 The view from the quadrotor's front camera; the green points are ORB features.

Fig.5 The created map: the blue rectangles mark the current position of the quadrotor, the green lines its flight path, the red points the ORB features observed from the current perspective, and the gray points the feature points saved earlier.

When the quadrotor flies around the two houses on the left in a circle, a loop closure is detected, after which the quadrotor can be positioned on the map more quickly.

UAV Tracking Simulation
In the second simulation experiment, we tested the quadrotor tracking a moving object using OpenTLD, with the tracking framework following [4] [17]. The simulation platform is the same as in the previous experiment; results are shown in figures 6 and 7.
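[4] and [17] close the loop from the tracker's bounding box to velocity commands. As a simplified illustration only (a plain proportional law, not the actual controller of [4] [17], with illustrative gains), such following behavior can be sketched as:

# Simplified bounding-box-based following: yaw to center the target,
# move forward/backward to keep its apparent size constant.
from geometry_msgs.msg import Twist

K_YAW, K_FWD = 0.002, 0.0005   # illustrative proportional gains
TARGET_AREA = 90000.0          # desired box area (pixels^2), a distance proxy

def follow_command(box, img_w):
    """box = (x, y, w, h) from the OpenTLD tracker, img_w = image width."""
    x, y, w, h = box
    cmd = Twist()
    # Yaw so the box center moves toward the image center.
    cmd.angular.z = -K_YAW * ((x + w / 2.0) - img_w / 2.0)
    # Move forward if the target looks too small, backward if too large.
    cmd.linear.x = K_FWD * (TARGET_AREA - w * h)
    return cmd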


Fig.6 The scene is the same as in the previous experiment except for an added movable Turtlebot; the quadrotor automatically follows it as it moves away.


Fig.7 The image view from the quadrotor's front camera; for more detail on the vision algorithm see [4] [17].

The result of the second experiment is that the quadrotor follows behind the Turtlebot. It is able to follow the target and stabilize itself, and it can track a large variety of different objects.

B. Reality
We also test vision algorithms on realistic UAVs through our general simulation platform. In the realistic tests with the AR.Drone, we use the laptop keyboard to take off the AR.Drone 2.0 and manually select the bounding box of the object (in our experiment a bag, as shown in figure 8), then start automatic tracking. Moreover, when another person passes in front of the AR.Drone's camera temporarily, it can still reacquire the target object and keep tracking.

Fig.8 The AR.Drone automatically tracking a person in the real world.

In the realistic tests with Pixhawk, we use radio control to take off the quadrotor and manually toggle it into the "offboard" mode. Supported by the autopilot, we can then control the aircraft through wireless data transmission modules, with all data obeying the MAVLink protocol. Before toggling to the "offboard" mode, we must already be sending control signals such as SET_POSITION_TARGET_LOCAL_NED (a MAVLink message) to the quadrotor, and we must keep the signal frequency above 2 Hz. In "offboard" mode the quadrotor is taken over by joysticks and vision algorithms. We use the radio control both for initialization and as a manual fallback in case the vision algorithms break down. The realistic test platform is shown in figure 9.
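As a minimal sketch of this handshake (assuming the standard mavros topic and service names, which are not spelled out above), setpoints are streamed first and the mode is switched only afterwards:

#!/usr/bin/env python
# Sketch of the offboard handshake via mavros. The autopilot rejects
# offboard mode unless setpoints are already streaming above 2 Hz.
import rospy
from geometry_msgs.msg import PoseStamped
from mavros_msgs.srv import SetMode

rospy.init_node('offboard_sketch')
setpoint_pub = rospy.Publisher('/mavros/setpoint_position/local',
                               PoseStamped, queue_size=10)
rate = rospy.Rate(20)  # stream well above the required 2 Hz

# Hover setpoint 2 m above the takeoff point.
target = PoseStamped()
target.pose.position.z = 2.0

# Prime the setpoint stream for ~2 s before requesting the mode switch.
for _ in range(40):
    setpoint_pub.publish(target)
    rate.sleep()

rospy.wait_for_service('/mavros/set_mode')
set_mode = rospy.ServiceProxy('/mavros/set_mode', SetMode)
set_mode(custom_mode='OFFBOARD')

# Keep streaming; stopping would drop the vehicle out of offboard mode.
while not rospy.is_shutdown():
    setpoint_pub.publish(target)
    rate.sleep()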

Fig.9 The realistic test platform: a quadrotor with the Pixhawk autopilot.

As for the Pelican (the product of AscTec), we use the API (Application Programming Interface) provided by AscTec as the driver between ROS and the Pelican. The Pelican is shown in figure 10.

Fig.10 The realistic test platform: the AscTec quadrotor.

IV. FUTURE WORK
In the future we will work on two aspects. The first is to increase the variety of aircraft supported by ROS in our simulation system. This improves the scalability of the simulation platform, which can also be extended toward UGVs, so that vision algorithms for UGVs can be tested on the platform as well; we will then add more vision algorithms to the platform, so that UAVs and UGVs can achieve more vision-based functions. The second aspect is to improve the connection between real UAVs and simulated ones. Our goal is to support more UAVs and to build more generic and more comprehensive functionality in the future.

V. CONCLUSION
In this work we built a general simulation platform that can test vision algorithms for different UAVs in a simulated environment. It thereby avoids unnecessary damage when testing hardware and lets us test vision algorithms safely in the simulator. Several tests have been carried out on our platform, in both simulation and the real environment, and the results are satisfactory.

ACKNOWLEDGMENT
This work was supported in part by the Hong Kong, Macao and Taiwan Science & Technology Cooperation Program of China (2014DFH10110), the National Natural Science Foundation of China (No. 61305021, 61325019, 61173096), the Zhejiang Provincial Department of Science and Technology Public Technology Research Industrial Project (No. 2014C31102), and the Doctoral Fund of Ministry of Education (No. 20133317120003).



REFERENCES
[1] T. Krajník, et al., "AR-Drone as a Platform for Robotic Research and Education," Research and Education in Robotics - EUROBOT 2011, Communications in Computer and Information Science, vol. 161, 2011, pp. 172-186.
[2] C. Bills, J. Chen, and A. Saxena, "Autonomous MAV Flight in Indoor Environments using Single Image Perspective Cues," Robotics and Automation (ICRA), 2011 IEEE International Conference on, 9-13 May 2011, pp. 5776-5783.
[3] J. Pestana, et al., "A Vision-based Quadrotor Swarm for the participation in the 2013 International Micro Air Vehicle Competition," Unmanned Aircraft Systems (ICUAS), 2014 International Conference on, 27-30 May 2014, pp. 617-622.
[4] J. Pestana, et al., "Computer Vision Based General Object Following for GPS-denied Multirotor Unmanned Vehicles," American Control Conference (ACC), 2014, 4-6 June 2014, pp. 1886-1891.
[5] J. Engel, J. Sturm, and D. Cremers, "Accurate Figure Flying with a Quadrocopter Using Onboard Visual and Inertial Sensing," in Proc. of the Workshop on Visual Control of Mobile Robots (ViCoMoR) at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2012.
[6] A. Santamaria-Navarro, V. Lippiello, and J. Andrade, "Task Priority Control for Aerial Manipulation," Safety, Security, and Rescue Robotics (SSRR), 2014 IEEE International Symposium on, 27-30 Oct. 2014, pp. 1-6.
[7] P. Benavidez, J. Lambert, A. Jaimes, and M. Jamshidi, "Landing of an AR.Drone 2.0 Quadcopter on a Mobile Base Using Fuzzy Logic," World Automation Congress (WAC), 2014, 3-7 Aug. 2014, pp. 803-812.
[8] J. Meyer, et al., "Comprehensive Simulation of Quadrotor UAVs using ROS and Gazebo," Simulation, Modeling, and Programming for Autonomous Robots, Lecture Notes in Computer Science, vol. 7628, 2012, pp. 400-411.
[9] L. Meier, et al., "PIXHAWK: A system for autonomous flight using onboard computer vision," Robotics and Automation (ICRA), 2011 IEEE International Conference on, 9-13 May 2011, pp. 2992-2997.
[10] A. H. N. Koenig. (2013). "gazebo - ROS Wiki". Available: http://www.ros.org/wiki/gazebo
[11] M. Quigley, B. Gerkey, K. Conley, J. Faust, T. Foote, J. Leibs, E. Berger, R. Wheeler, and A. Ng, "ROS: an open-source Robot Operating System," in IEEE International Conference on Robotics and Automation (ICRA 2009), 2009.
[12] "Player/Stage - Stage". Available: http://playerstage.sourceforge.net/index.php?src=stage
[13] "Pixhawk - MAVLink". Available: https://pixhawk.ethz.ch/mavlink/
[14] R. Mur-Artal, J. M. M. Montiel, and J. D. Tardós, "ORB-SLAM: a Versatile and Accurate Monocular SLAM System," IEEE Transactions on Robotics, under review.
[15] R. Mur-Artal and J. D. Tardós, "ORB-SLAM: Tracking and Mapping Recognizable Features," Robotics: Science and Systems (RSS) Workshop on Multi View Geometry in Robotics (MVIGRO), Berkeley, USA, July 2014.
[16] G. Nebehay, "Robust Object Tracking Based on Tracking-Learning-Detection," Master's thesis, Vienna University of Technology, Austria, 2012. [Online]. Available: http://gnebehay.github.io/OpenTLD/gnebehay_thesis_msc.pdf
[17] J. Pestana, et al., "Vision based GPS-denied Object Tracking and Following for Unmanned Aerial Vehicles," Safety, Security, and Rescue Robotics (SSRR), 2013 IEEE International Symposium on, pp. 1-6.
