My research mainly concerns uncertainty modeling and treatment with the help of imprecise probabilistic approaches (lower previsions theory, Dempster-Shafer theory, possibility theory). Roughly speaking, these approaches blend interval and probabilistic methods to deal with situations of severe uncertainty (missing or imprecise data, little available information, expert opinions, unreliable information, …).
Within these theories, most of my research effort goes into proposing practical solutions to various problems with these methods, or into linking the different solutions each theory offers for common problems (information fusion, independence modeling, …).
Work (mainly) benefiting from collaborations and discussions with D. Dubois, M. Troffaes, E. Miranda, L. Utkin, E. Chojnacki, E. Quaeghebeur and I. Sanchez
There exist many practical representations in imprecise probability theories, including possibility distributions, belief functions, imprecise probability assignments, pari-mutuel models, imprecise cumulative distributions (p-boxes), clouds, …
Aside from establishing new relations between the properties of several models (e.g., clouds, p-boxes, possibility distributions), we have proposed a model called the generalized p-box, which describes uncertainty by probabilistic bounds over a collection of nested sets. Such models appear naturally in elicitation procedures or statistical confidence structures, and first results indicate that generalized p-boxes may be an interesting non-parametric model for handling multivariate problems and/or bipolar information.
Our latest research on the topic includes the characterization of p-box limitations, as well as the investigation of some specific cases of comparative probabilities (where only singleton probabilities are qualitatively compared).
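As a minimal illustration of the p-box idea on a finite ordered space, a pair of lower/upper cumulative bounds already induces interval-valued probabilities for events. The function name and the numerical values below are hypothetical, chosen only to show the mechanism:

```python
# Sketch of a discrete p-box: two non-decreasing lists bounding the
# cumulative distribution, F_low[k] <= F(x_k) <= F_up[k].

def pbox_event_bounds(lower_cdf, upper_cdf, i, j):
    """Bounds on P(x_i < X <= x_j) induced by the CDF bounds."""
    p_low = max(0.0, lower_cdf[j] - upper_cdf[i])   # lower probability
    p_up = min(1.0, upper_cdf[j] - lower_cdf[i])    # upper probability
    return p_low, p_up

# Hypothetical three-point space with imprecise cumulative probabilities
F_low = [0.1, 0.4, 1.0]
F_up = [0.3, 0.7, 1.0]
print(pbox_event_bounds(F_low, F_up, 0, 2))  # interval probability of the event
```

The gap between the two bounds directly reflects how imprecise the available cumulative information is; a precise CDF (F_low = F_up) recovers ordinary probabilities.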
Work (mainly) benefiting from collaborations and discussions with D. Dubois, P. Buche, B. Charnomordic, E. Chojnacki, R. Thomopoulos, F. Sais, T. Burger and F. Pichon
Merging information from multiple sources is a recurring problem in modern systems. Common challenges in such merging include coping with dependent and conflicting sources, and accounting for source characteristics (reliability, propensity to lie, precision, conflict level, …).
For practical purposes, we have proposed to use the notion of maximal coherent subsets as a way to deal with conflict among sources, and have applied it to the problem of estimating source reliability from meta-information.
Our current focus in this area concerns the characterization of inconsistency and reliability degrees resulting either from assumptions about the source of information or from the combination of the different pieces of information. Such degrees can then be used to guide the fusion process.
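The maximal coherent subsets idea can be sketched on the simplest case, where each source provides an interval: one keeps the largest groups of sources whose intervals still have a common point. This is only an illustrative toy (names and the brute-force enumeration are mine, not taken from our papers, where more efficient schemes apply):

```python
from itertools import combinations

def intersect(intervals):
    """Common part of a list of intervals, or None if they conflict."""
    lo = max(a for a, _ in intervals)
    hi = min(b for _, b in intervals)
    return (lo, hi) if lo <= hi else None

def maximal_coherent_subsets(intervals):
    """All maximal subsets of sources whose intervals share a common point."""
    n = len(intervals)
    coherent = []
    # enumerate subsets from largest to smallest; keep those that are
    # consistent and not already included in a found coherent subset
    for size in range(n, 0, -1):
        for idx in combinations(range(n), size):
            if any(set(idx) <= s for s in coherent):
                continue
            if intersect([intervals[i] for i in idx]) is not None:
                coherent.append(set(idx))
    return coherent

# Three hypothetical sources, the third conflicting with the other two
sources = [(0.0, 2.0), (1.0, 3.0), (5.0, 6.0)]
print(maximal_coherent_subsets(sources))  # → [{0, 1}, {2}]
```

One can then combine conjunctively (intersect) inside each coherent subset and disjunctively (union) across them, so that conflict is handled without discarding any source a priori.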
Work (mainly) benefiting from collaborations and discussions with D. Dubois, G. De Cooman, E. Chojnacki, J. Baccou, T. Burger, M. Sallak, M.C.M. Troffaes, F. Coolen, S. Ferson, F. Aguirre and I. Sanchez
How to propagate uncertainty through various models is an important issue that raises several difficulties. Most of my research in this domain has concerned the propagation of uncertainty models through deterministic functions, using methods that combine Monte-Carlo simulation and interval analysis, with industrial risk assessment as the target application.
Related problems are how to model independence so as to obtain tractable joint models (and how to compute with such joint models), and how to simulate a given imprecise probabilistic model.
Some of our recent research also deals with the problem of how to efficiently evaluate system reliability when the component reliabilities are uncertain and modelled by imprecise probabilistic knowledge (more specifically, belief functions).
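The combination of sampling and interval analysis can be sketched for the belief-function case: each focal interval is propagated through the deterministic function, and its image decides whether the mass counts towards the belief (whole set satisfies the event) or only the plausibility (some element may). The names are hypothetical, and the grid evaluation below stands in for the interval analysis or optimization used in practice:

```python
def propagate_random_set(focal_sets, f, threshold, n_grid=100):
    """Belief/plausibility that f(x) <= threshold, for a belief function
    on the input given as weighted focal intervals [(lo, hi), mass]."""
    bel = pl = 0.0
    for (lo, hi), mass in focal_sets:
        # approximate the image of the focal interval on a grid
        xs = [lo + (hi - lo) * k / n_grid for k in range(n_grid + 1)]
        ys = [f(x) for x in xs]
        if max(ys) <= threshold:   # every point of the focal set satisfies the event
            bel += mass
        if min(ys) <= threshold:   # at least one point may satisfy it
            pl += mass
    return bel, pl

# Hypothetical belief function with two focal intervals
focal = [((0.0, 1.0), 0.6), ((0.5, 2.0), 0.4)]
print(propagate_random_set(focal, lambda x: x * x, 1.0))  # → (0.6, 1.0)
```

The interval [bel, pl] then bounds the probability of the output event under every probability measure compatible with the input belief function.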
Work (mainly) benefiting from collaborations and discussions with B. Quost, T. Denoeux, B. Ben Yaghlane, N. Sutton-Charani, G. Yang, M. Masson, E. Hüllermeier, A. Antonucci, G. Corani, M. Poss and N. Ben Abdallah
Beyond extending some classical classifiers (k-NN methods, naive Bayes classifiers) to imprecise probabilistic settings, our work currently focuses on the combination of classifiers, addressing both the usual multi-class classification problem and more complex problems such as label ranking and multi-label classification.
One of our current favorite fields of investigation is so-called binary decomposition, where complex problems are decomposed into several binary ones (facilitating learning, but increasing the number of models to learn). In the future, we plan to focus more on active learning, where imprecise probabilistic methods can play an important role, owing to their ability to identify cases where information is missing.
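A toy sketch of binary decomposition in an imprecise setting: in a one-vs-one scheme, each pairwise classifier returns an interval for the probability that one class beats another, and the prediction keeps every class not surely dominated, which may be a set of classes rather than a single one. All names and numbers below are hypothetical:

```python
def undominated_classes(classes, pairwise):
    """pairwise[(a, b)] = (low, high) bounds on P(a preferred to b).
    Keep each class a unless some b surely beats it (lower bound > 0.5)."""
    keep = []
    for a in classes:
        dominated = any(
            pairwise[(b, a)][0] > 0.5       # b beats a under every compatible model
            for b in classes if b != a
        )
        if not dominated:
            keep.append(a)
    return keep

# Hypothetical interval-valued pairwise comparisons for three classes
bounds = {
    ("cat", "dog"): (0.6, 0.8), ("dog", "cat"): (0.2, 0.4),
    ("cat", "bird"): (0.4, 0.7), ("bird", "cat"): (0.3, 0.6),
    ("dog", "bird"): (0.3, 0.5), ("bird", "dog"): (0.5, 0.7),
}
print(undominated_classes(["cat", "dog", "bird"], bounds))  # → ['cat', 'bird']
```

Returning a set of undominated classes is precisely the kind of cautious, possibly imprecise prediction that signals where information is missing, hence the connection with active learning.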
Work (mainly) benefiting from collaborations and discussions with P. Buche, B. Charnomordic, O. Strauss, V. Guillard, E. Chojnacki, M. Sallak, I. Thouvenin
We have applied ideas coming from imprecise probability theories and more generally concerning uncertainty handling to a number of frameworks, including: