
Guillaume SANAHUJA



PhD title:

Embedded vision localization and control of an aerial drone
(Commande et localisation embarquée d’un drone en utilisant la vision)


Co-advisor: Isabelle FANTONI
Grant: MENRT
Location: Heudiasyc
Date PhD finished: January 29th, 2010

Current position: Research engineer (IR) at the Heudiasyc laboratory, France


Abstract

This work deals with the design of nonlinear control laws to stabilize a UAV (Unmanned Aerial Vehicle) and with vision-based localization. First, a literature review of UAVs and their control laws was carried out, and several of these laws were tested and compared on a PVTOL (Planar Vertical Take-Off and Landing) aircraft platform. A control law based on saturation functions was then proposed to stabilize a chain of n integrators; its particularity is that each state appears in its own, separate saturation term.
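As an illustration of this idea (a minimal sketch, not the exact formulation proposed in the thesis), consider the chain of integrators \dot{x}_i = x_{i+1} for i = 1, ..., n-1 and \dot{x}_n = u. A bounded control of the form

u = -\sigma_{b_1}(k_1 x_1) - \sigma_{b_2}(k_2 x_2) - \dots - \sigma_{b_n}(k_n x_n)

uses one saturation function \sigma_{b_i} (of level b_i) and one gain k_i per state, so that each state is handled by a separate term rather than by nested saturations; the gains and saturation levels must satisfy suitable conditions for the closed loop to converge.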

A quadrotor platform was then built, together with a simulator for it. The simulator runs the UAV's on-board program on a computer using a dynamical model of the vehicle, and feeds the vision algorithms with virtual images so that they can be tested. Several computer vision algorithms were evaluated. To meet the embedded-computation constraints, a lightweight optical flow method (working in only one direction) was implemented on a mobile robot for obstacle avoidance. Stereovision-based solutions using laser pointers were then proposed: the first technique estimates the vehicle's attitude and the second one enables wall following.
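The sketch below illustrates the kind of single-direction optical flow computation involved. It is only an illustrative example (the function names, the least-squares formulation and the steering heuristic are assumptions, not the implementation of the thesis):

import numpy as np

def horizontal_flow(prev, curr, n_bands=8):
    """Estimate optical flow along one direction (horizontal) only.

    prev, curr: two consecutive grayscale frames (2-D arrays).
    Returns one flow value (pixels/frame) per vertical band, obtained by
    solving the brightness-constancy equation Ix*u + It = 0 in the
    least-squares sense over each band.
    """
    prev = prev.astype(np.float32)
    curr = curr.astype(np.float32)
    ix = np.gradient(prev, axis=1)   # spatial gradient along x
    it = curr - prev                 # temporal gradient between frames
    flows = []
    for band in np.array_split(np.arange(prev.shape[1]), n_bands):
        ixb, itb = ix[:, band], it[:, band]
        flows.append(-np.sum(ixb * itb) / (np.sum(ixb * ixb) + 1e-6))
    return np.array(flows)

def steering_command(flows, gain=0.5):
    """Hypothetical avoidance rule: turn away from the side with the
    largest flow magnitude, i.e. the side where obstacles appear closest."""
    half = len(flows) // 2
    left = np.abs(flows[:half]).sum()
    right = np.abs(flows[half:]).sum()
    return gain * (left - right)     # positive -> steer right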

Finally, a control scheme combining a predictor and an observer was studied. This combination accounts for delays in the control loop, such as those introduced by video processing. Moreover, the proposed scheme makes efficient use of a sensor with a long sampling period (such as a camera) while the control law runs at a higher rate.
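A minimal sketch of such a scheme is given below, assuming a linear discrete-time model, a known and constant vision delay, and a faster control period; the class and method names are hypothetical and this is not the exact predictor/observer pair of the thesis:

import numpy as np

class DelayedVisionObserver:
    """Observer running at the fast control rate, corrected only when a
    delayed camera measurement arrives. Past estimates and inputs are
    buffered so the correction can be applied at the instant the image
    was taken and then propagated forward to the current time."""

    def __init__(self, A, B, C, L, delay_steps):
        self.A, self.B, self.C, self.L = A, B, C, L
        self.d = delay_steps              # vision delay, in control periods
        self.x = np.zeros(A.shape[0])     # current state estimate
        self.history = []                 # (estimate, input) for the last d steps

    def fast_step(self, u):
        """Called every control period: open-loop prediction with the model."""
        u = np.asarray(u)
        self.history.append((self.x.copy(), u))
        if len(self.history) > self.d:
            self.history.pop(0)
        self.x = self.A @ self.x + self.B @ u
        return self.x

    def camera_update(self, y_delayed):
        """Called when a vision measurement (d steps old) becomes available."""
        if len(self.history) < self.d:
            return self.x
        x_past, _ = self.history[0]
        # Luenberger-type correction at the instant the image was taken...
        x_corr = x_past + self.L @ (y_delayed - self.C @ x_past)
        # ...then replay the buffered inputs to predict up to the present.
        for _, u in self.history:
            x_corr = self.A @ x_corr + self.B @ u
        self.x = x_corr
        return self.x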




