======Recent or ongoing projects======
===National projects===
**[[https://anr.fr/Projet-ANR-22-CE33-0010|ANR MATCH]]** (Multisensory and affective modulations of touch: interaction and intentionality. The objective is to understand the sensation of being touched by an embodied autonomous agent.)\\
Social touch is central to the interpersonal interactions of daily life. However, it is sorely lacking in interactions with virtual agents. No current technology is capable of satisfactorily generating the impression of being touched by a virtual agent during a social interaction in virtual reality (VR). Our goal is to meet this challenge by overcoming, as much as possible, the need for tactile stimulation. We will seek to determine the minimal conditions allowing the emergence of the impression of being touched, by studying the impact of multisensory integration, agency, and social context. We will optimize the effect of gesture and contextual parameters on the perception of social touch. We will use cross-modal strategies to induce the illusion of touch in VR through the substitution of other sensory modalities, such as audio. We will also manipulate the multimodal behavior of the virtual agent (facial expressions, gaze, gesture) to maximize the impression of touch and the associated affective reactions.
\\
**[[https://socialtouch.hds.utc.fr/index.php/what-is-social-touch/publications|ANR SOCIAL TOUCH]]**: (Understanding, designing and evaluating the tactile modality in social human-machine interactions)\\
(ANR Société de l'information et de la communication 2018-2022)\\
This project is at the crossroads of Human-Machine Interaction (HMI) and Emotional Design. It investigates how the sense of touch can be integrated into interactive systems to leverage communicative and emotional channels between humans and machines, or between humans via machines. It focuses on a communicative modality that has been much less studied than other communication channels, namely touch, and addresses innovative topics such as the role of social touch in fostering engagement or providing intimate experience. It investigates how touch can serve to enhance human interaction with small devices, embodied conversational agents (ECAs) and Virtual Reality environments, and for interpersonal mediated communication. The project aims: (1) to understand the principles and functions of touch as an emotional way to communicate and to predict its impact on human-machine engagement; (2) to design novel human-machine interaction techniques to improve user experience;
**ANR MAC COY CRITICAL**: (Models for Adaptative feedback enriChment and Orchestration based virtual realitY in Critical situations)\\
(ANR FORMATION ET EDUCATION 2014 - 2019)\\
I collaborated with Domitile Lourdeaux on this project, led by Jean Marie Burkhardt (IFSTTAR-LPC), in which the question of adaptive feedback is central.\\
**FUI SERA**: Mobile and Interactive Augmented Reality for Driving Assistance (FUI project 2012-2015)
This project, led at the laboratory by Vincent Fremont, is a collaboration with industrial partners; the overall project is coordinated by VISTEON. Its central topic is augmented reality for driving, following on from Minh Tien Phan's PhD work.
