Description
Date of submission: 6 October 2021
Titre: Seeing and Being Seen: Modeling Social Attention for a Virtual/Physical Coach
Thesis supervisor:
Catherine PELACHAUD (ISIR (EDITE))
Thesis supervisor:
Laurence CHABY (ISIR (SMAER))
Scientific domain: Information and communication sciences and technologies
CNRS theme: Robotics
Abstract: Context: Conversational agents can take on a human appearance and communicate through both verbal and non-verbal means. They can serve as an interface in human-machine interaction, playing multiple roles such as assistant, teacher, guide, or companion.
During an interaction, agents communicate through their facial expressions, gaze, posture, and gestures. These behaviors can serve several communicative functions, such as managing turn-taking, highlighting important information, and displaying an emotional state or a social attitude. The agent can express its engagement in the interaction by gazing at the human interlocutor and reacting expressively to his or her behavior; it can indicate that it perceives, listens, agrees, or disagrees. The agent can be either virtual (like the GRETA agent) or physical (like the Furhat robot).
Aim of the thesis:
The objective of this thesis is to equip the virtual/physical coach with social attention, so as to give the human a feeling of presence and of being seen. To do so, the agent must be able to detect the direction of the human interlocutor's gaze as well as his/her non-verbal behavior, decide how to react, and generate a multimodal response that encourages behavior change. This work will focus particularly on gaze.
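To make this perceive-decide-generate loop concrete, here is a minimal Python sketch of one possible structure. It is only an illustration: the gaze_estimator and agent objects, the express method, and the 0.15 rad threshold are hypothetical placeholders, not part of Greta's or Furhat's actual APIs.

    import time

    MUTUAL_GAZE_THRESHOLD = 0.15  # radians; assumed tolerance for "looking at the agent"

    def social_attention_loop(gaze_estimator, agent):
        """Perceive the user's gaze, decide how to react, and generate a response."""
        while True:
            # Perceive: the human's gaze direction in radians ((0, 0) = toward the agent).
            gaze_x, gaze_y = gaze_estimator.current_gaze_angles()
            mutual = abs(gaze_x) < MUTUAL_GAZE_THRESHOLD and abs(gaze_y) < MUTUAL_GAZE_THRESHOLD
            if mutual:
                # Decide/generate: return the gaze and nod, signaling "I see you seeing me".
                agent.express(gaze_target="user", gesture="nod")
            else:
                # Decide/generate: briefly follow the user's gaze to display shared attention.
                agent.express(gaze_target=(gaze_x, gaze_y), gesture="lean_forward")
            time.sleep(0.1)  # decide at roughly 10 Hz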
After a review of the literature on social attention and gaze modeling in agents (Myllyneva & Hietanen, 2016; Ruhland et al., 2014), three main steps are envisaged:
1) Set up a platform that measures where humans look during the interaction, using open-source tools: OpenFace (Baltrusaitis et al., 2018), OpenSense (Stefanov et al., 2020) and/or OpenPose (Cao et al., 2018); a minimal sketch of such a measurement is given after this list.
2) Model and set up coaching scenarios likely to evoke the feeling of being seen. The study scenario involves a human participant and a virtual/physical agent.
3) Experimental studies: the agent's behavior will be manipulated to measure its impact on the human participant's perception as well as on the quality of the interaction. These measurements will be obtained through questionnaires.
The agent will be both virtual, like the Greta agent (https://github.com/isir/greta), and physical, like the Furhat robot (https://furhatrobotics.com/).
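As a starting point for step 1, the following minimal sketch computes how often a participant looked toward the agent from a CSV produced by OpenFace 2.0's FeatureExtraction tool, whose gaze_angle_x and gaze_angle_y columns give the averaged eye-gaze direction in radians ((0, 0) meaning gaze straight at the camera). It assumes the camera is placed at the agent's position; the 0.2 rad cone, the 0.8 confidence cut-off, and the file name participant01.csv are illustrative choices to be calibrated for the actual setup.

    import pandas as pd

    GAZE_CONE = 0.2  # radians; assumed "looking at the agent" tolerance

    def gaze_at_agent_ratio(csv_path):
        """Fraction of reliably tracked frames whose gaze falls within the cone."""
        df = pd.read_csv(csv_path)
        df.columns = df.columns.str.strip()  # OpenFace pads column names with spaces
        tracked = df[(df["success"] == 1) & (df["confidence"] > 0.8)]
        at_agent = (tracked["gaze_angle_x"].abs() < GAZE_CONE) & \
                   (tracked["gaze_angle_y"].abs() < GAZE_CONE)
        return float(at_agent.mean())

    print(f"Gaze on agent in {gaze_at_agent_ratio('participant01.csv'):.0%} of frames")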
References:
Baltrusaitis, T., Zadeh, A., Lim, Y. C., & Morency, L. P. (2018, May). OpenFace 2.0: Facial behavior analysis toolkit. In 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018) (pp. 59-66). IEEE.
Cao, Z., Hidalgo, G., Simon, T., Wei, S. E., & Sheikh, Y. (2018). OpenPose: realtime multi-person 2D pose estimation using Part Affinity Fields. arXiv preprint arXiv:1812.08008.
Ruhland, K., Andrist, S., Badler, J., Peters, C., Badler, N., et al. (2014, April). Look me in the eyes: A survey of eye and gaze animation for virtual agents and artificial systems. In Eurographics 2014 - State of the Art Reports (pp. 69-91). Strasbourg, France. doi:10.2312/egst.20141036.
Laidlaw, K. E., Foulsham, T., Kuhn, G., & Kingstone, A. (2011). Potential social interactions are important to social attention. Proceedings of the National Academy of Sciences, 108(14), 5548-5553.
Myllyneva, A., & Hietanen, J. K. (2016). The dual nature of eye contact: to see and to be seen. Social Cognitive and Affective Neuroscience, 11(7), 1089-1095.
Stefanov, K., Huang, B., Li, Z., & Soleymani, M. (2020, October). OpenSense: A Platform for Multimodal Data Acquisition and Behavior Perception. In Proceedings of the 2020 International Conference on Multimodal Interaction (pp. 660-664).
Syrjämäki, A. H., Isokoski, P., Surakka, V., Pasanen, T. P., & Hietanen, J. K. (2020). Eye contact in virtual reality–A psychophysiological study. Computers in Human Behavior, 112, 106454.
Doctoral candidate: Younsi Nezih