Thesis topic

Expression Mapping onto Avatars for Human-Agent Interactions and Extended Reality Applications

  • Type
    Doctorate

Description

One of the major pillars of the current industrial revolution is eXtended Reality (XR). The topic has attracted considerable interest in both academia and industry (Meta, Apple, Google, Nvidia, etc.), due to its potential socio-cultural and economic impact.


In the context of the collaborative Wal4XR project, which gathers five universities in Wallonia and Brussels around XR, this PhD thesis will focus on improving the retargeting of a user’s audiovisual expressions onto a 3D avatar. The thesis will have two main objectives. On the one hand, it will focus on improving audiovisual expression detection systems in the context of XR applications with and without occlusions (i.e., with or without a Head-Mounted Display, for scenarios such as virtual meetings). On the other hand, the aim is to explore flexible retargeting techniques for matching different users to different avatars as seamlessly as possible.


The aim here is to represent the user controlling the avatar as accurately as possible and to express their communicative intention as precisely as possible. The most intuitive way to achieve this is to automatically detect the user’s expressions and retarget them onto the avatar. This poses various problems depending on the medium: occlusions due to head-mounted devices, limitations of human-to-avatar retargeting technologies, etc. There is interesting and promising work in the literature that tackles these problems [1,2], but there is still plenty of room for improvement, owing to the limitations of expression detection systems, the accumulation of errors in the retargeting process, the lack of flexibility in retargeting methods, and the need to take multimodality into account (current systems mainly focus on facial expressions).
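As a minimal illustration of the retargeting step described above, one common approach is to map detected facial blendshape weights onto the avatar rig's own blendshapes through a hand-authored correspondence table. The shape names, weights, and gains below are purely hypothetical and not tied to any specific tracker or avatar rig:

```python
# Hypothetical detector output: expression blendshape weights in [0, 1]
# (names and values are illustrative, not from any real tracking system).
source_weights = {"smile": 0.8, "jaw_open": 0.3, "brow_raise": 0.5}

# Hand-authored correspondence from detected shapes to this avatar's rig.
# Each entry maps a source shape to (avatar_shape, gain); the gain
# compensates for how strongly each rig expresses the same movement.
mapping = {
    "smile": ("mouthSmile", 1.0),
    "jaw_open": ("jawOpen", 0.9),
    "brow_raise": ("browInnerUp", 1.2),
}

def retarget(weights, mapping):
    """Map detected weights onto avatar blendshapes, clamped to [0, 1]."""
    avatar = {}
    for shape, w in weights.items():
        if shape in mapping:
            target, gain = mapping[shape]
            avatar[target] = max(0.0, min(1.0, w * gain))
    return avatar

print(retarget(source_weights, mapping))
```

A fixed table like this is exactly where the flexibility problem mentioned above appears: it must be re-authored for every new user–avatar pair, which motivates learning the correspondence instead.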


[1] Purps, C.F., Janzer, S., Wölfel, M. (2022). Reconstructing Facial Expressions of HMD Users for Avatars in VR. In: Wölfel, M., Bernhardt, J., Thiel, S. (eds) ArtsIT, Interactivity and Game Creation. ArtsIT 2021. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol. 422. Springer, Cham. https://doi.org/10.1007/978-3-030-95531-1_5
[2] Zhang, J., Chen, K., Zheng, J. (2022). Facial Expression Retargeting From Human to Avatar Made Easy. IEEE Transactions on Visualization and Computer Graphics, vol. 28, no. 2, pp. 1274-1287. https://doi.org/10.1109/TVCG.2020.3013876

About this topic

Related to
Service
ISIA
Promoters
Thierry Dutoit
Kevin El Haddad

Contact us for more info