
José Ernesto GOMEZ BALDERAS (HDS)



PhD title:

Localisation et commande embarquée d'un drone en utilisant la vision stéréoscopique (Localization and embedded control of a drone using stereoscopic vision)


Co-advisor: Rogelio LOZANO
Grant from the Mexican Government
Location: Heudiasyc
Date PhD finished: November 28th, 2011

Current position: Associate Professor (Maître de Conférences) at GIPSA Lab since 2013


Abstract

Visual servoing is a control approach based on visual information. In this thesis, visual servoing schemes are proposed to control a quadrotor and an octarotor for positioning and navigation tasks. For the quadrotor, we use a hierarchical control scheme whose inner loop (fast dynamics) handles the attitude dynamics, while the outer loop (slow dynamics) deals with the translational dynamics.
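As a rough illustration of this inner/outer-loop structure (not the controller implemented in the thesis), the sketch below runs a slow translational loop at the vision rate and a fast attitude loop at the IMU rate for a single lateral axis; all gains, loop rates and the small-angle coupling model are assumptions made for the example.

```python
import numpy as np

# Illustrative hierarchical controller: an outer (position) loop running slowly
# feeds attitude references to a faster inner (attitude) loop.
# Gains and rates are placeholders, not the values used in the thesis.

DT_INNER = 0.002   # inner-loop period (s), fast attitude dynamics
OUTER_EVERY = 25   # outer loop runs 25x slower (e.g. at the vision rate)

def outer_loop(pos, vel, pos_ref):
    """Translational (slow) loop: position error -> desired tilt angle."""
    kp, kd = 0.8, 0.6
    acc_des = kp * (pos_ref - pos) - kd * vel        # desired lateral acceleration
    return float(np.clip(acc_des / 9.81, -0.3, 0.3)) # small-angle attitude reference (rad)

def inner_loop(att, att_rate, att_ref):
    """Attitude (fast) loop: tracks the reference from the outer loop."""
    kp, kd = 12.0, 3.0
    return kp * (att_ref - att) - kd * att_rate      # torque command

# Toy simulation of one lateral axis (position <- attitude <- torque)
pos, vel, att, att_rate, att_ref = 0.0, 0.0, 0.0, 0.0, 0.0
for k in range(10000):                               # 20 s of simulated flight
    if k % OUTER_EVERY == 0:                         # slow loop (vision rate)
        att_ref = outer_loop(pos, vel, pos_ref=1.0)
    torque = inner_loop(att, att_rate, att_ref)      # fast loop (IMU rate)
    att_rate += torque * DT_INNER
    att += att_rate * DT_INNER
    vel += 9.81 * att * DT_INNER                     # small-angle translational coupling
    pos += vel * DT_INNER

print(f"position after 20 s: {pos:.2f} m (target 1.0 m)")
```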

A nonlinear controller based on separated saturations is also proposed to stabilize the quadrotor's attitude. The linear position and velocity of the rotorcraft are obtained with a vision-based algorithm using a monocular camera. The dynamic model of the quadrotor is derived using the Newton-Euler formalism. In another vision system, two cameras are used to estimate the translational position and velocity of the vehicle: the position is obtained with a frontal camera looking at a target placed on a wall, while the velocity is estimated with a camera pointing vertically downwards running an optical flow algorithm. Experimental tests showed that the quadrotor performed well in hover flight using the proposed vision-based control system.
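The separated-saturations idea can be illustrated with a minimal sketch in which the rate term and the angle-error term are bounded individually, so that the total command always respects an actuator limit; the gains, bounds and toy double-integrator attitude model below are assumptions for illustration, not the thesis controller.

```python
import numpy as np

def sat(x, limit):
    """Saturate x to the interval [-limit, +limit]."""
    return float(np.clip(x, -limit, limit))

def saturated_attitude_control(angle, rate):
    """Bounded attitude stabilizer: the rate and angle terms are saturated
    separately, so the total torque command stays within |u| <= 0.8."""
    return -(sat(2.0 * rate, 0.5) + sat(4.0 * angle, 0.3))

# Toy check on a double-integrator attitude model, angle'' = u
dt, angle, rate = 0.002, 0.6, 0.0        # start 0.6 rad off level
for _ in range(5000):                    # 10 s of simulated flight
    u = saturated_attitude_control(angle, rate)
    rate += u * dt
    angle += rate * dt
print(f"angle after 10 s: {angle:.4f} rad")
```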

Vision-based quadrotor control

The same system was used to estimate the 3D position of the quadrotor along a trajectory using vanishing points. The performance of the vision and control algorithms was tested in a real application in which the quadrotor tracks a line painted on a wall. Similarly, the velocity estimate is obtained with an optical flow algorithm. The position and velocity estimates from the vision system are combined with the angular rates and displacements from the inertial measurement unit to compute the control inputs. It has been shown that the proposed control scheme achieves the tracking of the visual reference.
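As a hedged sketch of how a downward-looking camera can yield a velocity estimate from optical flow, under a flat-ground, pinhole-camera assumption (the thesis may use a different flow algorithm and scaling), here is an example based on OpenCV's Farneback dense flow; the function name and parameter values are illustrative.

```python
import cv2
import numpy as np

def velocity_from_optical_flow(prev_gray, curr_gray, altitude_m, focal_px, dt):
    """Estimate horizontal velocity from a downward-looking camera.
    Dense Farneback flow (pixels/frame) is averaged over the image and
    scaled by altitude / focal length to obtain metres per second."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mean_flow_px = flow.reshape(-1, 2).mean(axis=0)       # [du, dv] in pixels/frame
    vel_mps = mean_flow_px * altitude_m / focal_px / dt   # pinhole scaling, flat ground
    return vel_mps                                        # [vx, vy] in the image frame
```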

Vision-based octarotor control

This thesis also presents visual feedback control of an octarotor using image-based visual servoing (IBVS) with stereo vision. Autonomous control of a UAV requires precise measurement and/or estimation of the vehicle's pose, as well as knowledge of its surrounding environment. In order to control the orientation and position of the flying robot with respect to a target, we propose a navigation system based on binocular vision combined with inertial sensors. This combination of sensors gives a complete characterization of the state of the aerial vehicle: the stereo vision system estimates the UAV's 3D position, while the inertial sensors provide the orientation of the rotorcraft. A semi-embedded navigation system combining stereo vision with inertial information is proposed.
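A minimal sketch of how the 3D position of a target can be recovered from a rectified stereo pair via disparity is given below; the focal length, baseline and pixel coordinates are made-up values for illustration and do not correspond to the thesis hardware.

```python
import numpy as np

def stereo_position(u_left, v_left, u_right, focal_px, baseline_m, cx, cy):
    """Triangulate a matched feature seen in a rectified stereo pair and
    return its 3D position in the left-camera frame."""
    disparity = u_left - u_right            # pixels; must be positive
    Z = focal_px * baseline_m / disparity   # depth from disparity
    X = (u_left - cx) * Z / focal_px
    Y = (v_left - cy) * Z / focal_px
    return np.array([X, Y, Z])

# Example: a feature at pixel (400, 260) in the left image and (380, 260) in
# the right one, with f = 700 px, baseline = 0.12 m, principal point (320, 240)
print(stereo_position(400, 260, 380, 700.0, 0.12, 320, 240))
# depth = 700 * 0.12 / 20 = 4.2 m
```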

The hierarchical control approach is well suited to stabilizing the 6-DOF dynamics of the quadrotor, since it exploits the time-scale separation between the rotational (fast) and translational (slow) dynamics. For this reason, the vision-based measurements, despite their lower rate, are able to stabilize the quadrotor's translational dynamics in real time. This combination of measurement strategies has many advantages: the vision system works very well at low speeds, the inertial sensors at high speeds, and the two run at different sample rates. Taking advantage of this property, we obtain a simplified dynamical model of the rotorcraft given by six independent double integrators, which are stabilized using proportional-derivative (PD) control. Real-time experiments have shown an acceptable performance of the flying machine under the proposed control law and sensing system.
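The "six independent double integrators stabilized by PD control" model can be sketched as follows; the gains, references and simulation settings are illustrative placeholders rather than the values used in the experiments.

```python
import numpy as np

# Simplified model: six decoupled double integrators (x, y, z, roll, pitch, yaw),
# each stabilized by its own PD law. All numbers below are placeholders.

kp = np.array([2.0, 2.0, 4.0, 16.0, 16.0, 4.0])   # proportional gains per axis
kd = np.array([2.5, 2.5, 3.5,  6.0,  6.0, 3.0])   # derivative gains per axis
ref = np.array([1.0, 0.0, 1.5, 0.0, 0.0, 0.0])    # desired hover state

state = np.zeros(6)      # positions / angles
rate = np.zeros(6)       # velocities / angular rates
dt = 0.01

for _ in range(1500):                              # 15 s of simulation
    u = kp * (ref - state) - kd * rate             # PD law, axis by axis
    rate += u * dt                                 # double integrator: x'' = u
    state += rate * dt

print(np.round(state, 3))                          # should be close to ref
```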

An embedded control system for the mini rotorcraft is implemented and validated by experimental tests. The results show that implementing the control law on an embedded system is satisfactory for autonomous hovering indoors and outdoors with light or no wind. Real-time experiments were carried out to validate the performance of the proposed navigation systems. This work highlights the potential of computer-vision-based position control strategies for UAVs.




