PhD Thesis

Cooperative Perception Integrity for Intelligent Vehicles

Manuscript available on HAL.
Supervised by Véronique Cherfaoui and Philippe Bonnifait; funded by a CNRS grant.

In order to navigate safely and comfortably, intelligent vehicles require highly reliable perception of their environment. Because on-board sensors are necessarily limited in range and their field of view can be occluded, an emerging solution is cooperative perception: vehicles share their perception with one another via wireless communication.

Intelligent vehicles can thus communicate rich information over long distances, seeing further and more completely than their on-board sensors alone would allow. However, information from external sources must be treated with caution, as misleading information can lead to dangerous situations. The sources that degrade the integrity of this information within the cooperative system must therefore be kept to a minimum. In this thesis, we study these sources and propose suitable methods for managing them and preventing their propagation. Our work focuses in particular on the fusion of tracked objects, the representation of the areas covered by perception systems, and the management of the trust attributed to other communicating agents.

In order to avoid underestimating the uncertainty of the perceived object states, we study data-fusion filters capable of handling the information loops induced by these exchanges. Our results on simulated data show that the split covariance intersection filter is well suited to this problem. Coupled with the parameter-tuning methodology we propose, it also appears to outperform more conventional methods.
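The fusion step of a split covariance intersection filter can be sketched as follows. This is a minimal NumPy illustration, not the thesis's implementation: it assumes each estimate carries a covariance split into a dependent part (possibly correlated between sources) and an independent part, and selects the weight by a simple grid search minimising the fused trace. All names are illustrative.

```python
import numpy as np

def sci_fuse(x1, P1d, P1i, x2, P2d, P2i, n_w=99):
    """Split Covariance Intersection fusion of two estimates.

    (x*, P*d, P*i): state, dependent covariance part, independent part.
    The dependent parts are inflated by 1/w and 1/(1-w) so the fused
    estimate stays consistent whatever the unknown cross-correlation.
    """
    best = None
    for w in np.linspace(0.01, 0.99, n_w):
        S1 = P1d / w + P1i           # inflated covariance of source 1
        S2 = P2d / (1.0 - w) + P2i   # inflated covariance of source 2
        I1, I2 = np.linalg.inv(S1), np.linalg.inv(S2)
        P = np.linalg.inv(I1 + I2)   # information-form fusion
        if best is None or np.trace(P) < best[0]:
            x = P @ (I1 @ x1 + I2 @ x2)
            # Propagate the independent parts so the result can be
            # split again and re-fused later without double counting.
            Pi = P @ (I1 @ P1i @ I1 + I2 @ P2i @ I2) @ P
            best = (np.trace(P), x, P - Pi, Pi)
    _, x, Pd, Pi = best
    return x, Pd, Pi
```

Minimising the trace of the fused covariance is one common criterion for choosing w; other cost functions (e.g. the determinant) can be substituted without changing the structure.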

Next, we introduce a formalism for representing the areas covered by each sensor and the areas seen as free, in order to better fuse the detected objects: evidential detectability grids, based on the theory of belief functions. These grids make it possible to merge several points of view into a global representation of the environment, while explicitly managing uncertainties.
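As a toy illustration of belief-function fusion on a grid cell, the sketch below combines two sources on the two-hypothesis frame {Free, Occupied} with Dempster's rule. The thesis's detectability grids use a richer frame than this; the function and mass layout here are an assumption for the example only.

```python
def dempster_2d(m1, m2):
    """Dempster's rule on the frame {Free, Occupied}.

    A mass function is a tuple (m_free, m_occ, m_unknown), summing to 1,
    where m_unknown is the mass on the whole frame (ignorance).
    """
    f1, o1, u1 = m1
    f2, o2, u2 = m2
    conflict = f1 * o2 + o1 * f2        # mass on the empty set
    k = 1.0 - conflict                  # normalisation factor
    f = (f1 * f2 + f1 * u2 + u1 * f2) / k
    o = (o1 * o2 + o1 * u2 + u1 * o2) / k
    u = (u1 * u2) / k
    return (f, o, u)
```

Two sources that both lean towards "free" reinforce each other while the residual ignorance shrinks, which is exactly the behaviour needed to merge overlapping fields of view.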

Finally, we propose a method by which each vehicle builds a trust index for the other cooperative agents. It is based on an evidential tree combining several pieces of evidence, such as the consistency and concordance of the information received. The trust index is then used to ensure that each vehicle reliably combines locally perceived information with information transmitted by other vehicles.
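One simple way such a trust index could be formed is sketched below: each cue (consistency, concordance, ...) is expressed as a mass function on {trustworthy, distrusted}, the cues are combined with Dempster's rule, and the index is the pignistic probability of "trustworthy". This flat combination is a hedged stand-in for the evidential tree described above; all names are illustrative.

```python
def combine(m1, m2):
    """Dempster's rule on the frame {T (trustworthy), D (distrusted)}.

    A mass function is (m_T, m_D, m_unknown), summing to 1.
    """
    t1, d1, u1 = m1
    t2, d2, u2 = m2
    k = 1.0 - (t1 * d2 + d1 * t2)       # discard conflicting mass
    return ((t1 * t2 + t1 * u2 + u1 * t2) / k,
            (d1 * d2 + d1 * u2 + u1 * d2) / k,
            (u1 * u2) / k)

def trust_index(evidence):
    """Fold per-cue mass functions into a scalar trust index in [0, 1].

    Starts from the vacuous mass (total ignorance) and returns the
    pignistic probability of 'trustworthy'.
    """
    m = (0.0, 0.0, 1.0)
    for e in evidence:
        m = combine(m, e)
    t, d, u = m
    return t + u / 2.0                  # split remaining ignorance evenly
```

With no evidence the index is 0.5 (no reason to trust or distrust); supportive cues push it towards 1, and the resulting scalar can then weight or discount the information received from that agent before fusion.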

The performance of the overall cooperative perception method is evaluated on real data obtained with three experimental vehicles equipped with omnidirectional LiDAR sensors. The corresponding datasets are made available to the scientific community.

Master

Vehicle localization using an HD vector map and a 3D LiDAR: implementation of a crosswalk detector and observation model.