Doctoral research project number: 2909

Description

Submission date: 1 January 1900
Title: Perception of emotions by third parties in face-to-face interactions
Thesis supervisor: Catherine PELACHAUD (ISIR (EDITE))
Scientific field: Information and communication sciences and technologies
CNRS theme: Not defined

Abstract: This PhD work is set in the field of affective computing. It has two main aspects: the study of mental-state attributions to perceived non-verbal behaviours, and a contribution to the non-verbal communicative skills of embodied agents.

First, for the human comprehension aspect, short audio-visual clips showing a person in a conversational context with another human were evaluated. Through a forced-choice questionnaire, students attributed mental states to the observed person. The questionnaire was based on items from the componential appraisal approach to emotion (Scherer & Ellgring 2007) and on the attribution of emotional labels. This appraisal theory frames mental states in terms of successive evaluations ("appraisals") and makes it possible to predict the link between facial expressions and mental-state attributions. In our case it was used to predict participants' intuitive answers to the observed expressions. The results of this study showed clearly significant correlations between various behaviours and the appraisal and emotion attributions.

As a second step, a virtual character, Greta, was used as a tool to validate the observations established in the first task and to examine the causal role of expressions in these attributions. Facial muscle movements (as described in the Facial Action Coding System; Ekman, Friesen & Hager 2002) were transposed to the MPEG-4 standard so that they could be applied to our virtual agent. In our agent, expressions of emotion are no longer modelled as static expressions at their apex; the system was improved (see Niewiadomski, Hyniewska & Pelachaud 2011) so that an expression can correspond to a temporally ordered sequence of multimodal signals. Short emotional expressions from the natural setting were reproduced on the virtual character, and individual behavioural cues were manipulated one by one to test their impact on third-party attribution of mental states.
A second study was then run using these videos of Greta, which were judged by human participants with the same questionnaire as in the first part.
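The idea of replacing a static apex expression with a temporally ordered sequence of multimodal signals can be illustrated with a minimal sketch. This is not the actual Greta implementation: the data structure, the signal labels, and the timings below are all hypothetical, chosen only to show how a sequential expression could be encoded and queried over time.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    """One timed signal in a multimodal expression (illustrative only)."""
    modality: str   # e.g. "face", "gaze", "head"
    label: str      # e.g. a FACS Action Unit or a gesture name
    start: float    # onset time in seconds
    end: float      # offset time in seconds

# A hypothetical sequence of signals forming one emotional expression,
# ordered in time rather than shown as a single static apex.
expression = [
    Signal("face", "AU12_smile", 0.0, 1.2),
    Signal("gaze", "avert_down", 0.4, 1.0),
    Signal("head", "turn_away",  0.6, 1.4),
]

def active_signals(sequence, t):
    """Return the labels of all signals active at time t."""
    return [s.label for s in sequence if s.start <= t < s.end]
```

At t = 0.5 s, for example, the smile and the gaze aversion overlap while the head turn has not yet started; manipulating one cue (removing a signal or shifting its onset) while keeping the others fixed mirrors the one-by-one cue manipulation described in the study.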

Doctoral candidate: Hyniewska Sylwia