Thesis topic

Expression Mapping onto Avatars for Human-Agent Interactions and Extended Reality Applications

  • Type
    Doctorate

Description

eXtended Reality (XR) is one of the major pillars of the current industrial revolution. The topic has attracted considerable interest both in academia and in industry (Meta, Apple, Google, Nvidia, etc.) due to its potential socio-cultural and economic impact.


In the context of the collaborative Wal4XR project, which brings together five universities in Wallonia and Brussels around XR, this PhD thesis will focus on improving the retargeting of a user’s audiovisual expressions onto a 3D avatar. The thesis has two main objectives. On the one hand, it will focus on improving audiovisual expression detection systems in the context of XR applications with and without occlusions (i.e., the presence or absence of a head-mounted display (HMD), as in scenarios such as virtual meetings). On the other hand, it will explore flexible retargeting techniques for matching different users to different avatars as seamlessly as possible.
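
As a concrete illustration of the first objective, the sketch below shows one simple way occlusion-aware expression estimation could be structured: when an HMD hides the upper face, a vision-based estimate (still reliable for the visible lower face) falls back to an audio-driven prior for the occluded region. The 52-coefficient blendshape layout, the index split, and the audio-driven fallback are illustrative assumptions, not the method prescribed by the thesis.

    import numpy as np

    # Assumed blendshape layout: indices 0-19 cover brows/eyes (hidden by an
    # HMD), indices 20-51 cover jaw/mouth (still visible below the headset).
    UPPER_FACE = slice(0, 20)

    def fuse_expression(visual_coeffs, audio_coeffs, hmd_worn):
        """Combine vision-based and audio-driven expression coefficients.

        visual_coeffs, audio_coeffs: (52,) arrays of weights in [0, 1].
        When an HMD occludes the upper face, fall back to the audio-driven
        estimate there and keep the visual estimate for the lower face.
        """
        fused = np.array(visual_coeffs, dtype=float)
        if hmd_worn:
            fused[UPPER_FACE] = np.asarray(audio_coeffs)[UPPER_FACE]
        return np.clip(fused, 0.0, 1.0)

    # Example: with the headset on, brow/eye weights come from the audio prior.
    vis, aud = np.zeros(52), np.full(52, 0.5)
    print(fuse_expression(vis, aud, hmd_worn=True)[UPPER_FACE][:3])  # [0.5 0.5 0.5]

A real system would replace this hard switch with a learned, confidence-weighted fusion; the sketch only makes the occlusion problem tangible.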


The aim is to represent the user controlling the avatar as faithfully as possible and to convey their communicative intent as precisely as possible. The most intuitive way to achieve this is to automatically detect the user’s expressions and retarget them onto the avatar. This poses various problems depending on the medium: occlusions due to head-mounted displays, limitations of human-to-avatar retargeting technologies, etc. Interesting and promising work in the literature tackles these problems [1,2], but there is still substantial room for improvement, due to the limitations of expression detection systems, the accumulation of errors in the retargeting process, the lack of flexibility in retargeting methods, and the need to take multimodality into account (current systems focus mainly on facial expressions).
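
To make the retargeting step concrete, here is a minimal sketch that treats human-to-avatar retargeting as a linear mapping between blendshape-weight spaces, fitted by ridge regression from paired example expressions. This is a toy baseline under assumed data shapes (a 52-coefficient tracker output, a 30-blendshape avatar rig), not the approach of [1] or [2].

    import numpy as np

    def fit_linear_retargeting(user_coeffs, avatar_weights, reg=1e-3):
        """Fit a ridge-regularized linear map from user to avatar blendshapes.

        user_coeffs:    (n_frames, n_user) tracked expression coefficients.
        avatar_weights: (n_frames, n_avatar) artist-authored target weights.
        Returns an (n_user, n_avatar) matrix M such that u @ M approximates a.
        """
        U = np.asarray(user_coeffs, dtype=float)
        A = np.asarray(avatar_weights, dtype=float)
        # Closed-form ridge solution: M = (U^T U + reg * I)^-1 U^T A
        return np.linalg.solve(U.T @ U + reg * np.eye(U.shape[1]), U.T @ A)

    def retarget(user_frame, M):
        """Map one frame of user coefficients to avatar weights in [0, 1]."""
        return np.clip(np.asarray(user_frame) @ M, 0.0, 1.0)

    # Toy usage with random stand-in data; real pairs would come from a face
    # tracker and a small set of manually matched user/avatar expressions.
    rng = np.random.default_rng(0)
    U = rng.uniform(size=(200, 52))  # 200 frames, 52 user coefficients
    A = rng.uniform(size=(200, 30))  # avatar rig with 30 blendshapes
    M = fit_linear_retargeting(U, A)
    print(retarget(U[0], M).shape)   # -> (30,)

Such a per-pair linear baseline also makes the open problems above tangible: its errors compound with those of the detector, and it must be refitted for every new user/avatar pair, which is exactly the lack of flexibility the thesis targets.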


[1] C. F. Purps, S. Janzer, and M. Wölfel, “Reconstructing Facial Expressions of HMD Users for Avatars in VR,” in ArtsIT, Interactivity and Game Creation (ArtsIT 2021), M. Wölfel, J. Bernhardt, and S. Thiel, Eds., Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol. 422, Springer, Cham, 2022. https://doi.org/10.1007/978-3-030-95531-1_5
[2] J. Zhang, K. Chen, and J. Zheng, “Facial Expression Retargeting From Human to Avatar Made Easy,” IEEE Transactions on Visualization and Computer Graphics, vol. 28, no. 2, pp. 1274–1287, Feb. 2022. https://doi.org/10.1109/TVCG.2020.3013876


Mission
  • Carry out research towards the above-mentioned objectives
  • Produce resources showcasing your research work and make them available to the community for reproducibility purposes
  • Participate in various meetings and events in the context of the Wal4XR project
Profile

You hold a Master’s degree in Computer Science, Electrical Engineering, or equivalent, ideally with a focus on one of the domains of interest here.

Required Skills

  • Comfortable working both autonomously and in a team
  • Able to adapt and learn new skills quickly
  • Good oral and written communication skills
  • Background or expertise in more than one of the following domains, with an interest in learning the others: machine learning (deep learning in particular), statistics, 3D modeling
  • Good programming skills in Python

Ideal Skills

  • Good programming skills in C# and C++
  • Proven track record in research, software engineering, software development, machine learning, 3D modeling, or XR-related fields in general

Interested?
Send an email with your CV and a motivation letter to Prof. T. Dutoit (thierry.dutoit @ umons.ac.be)

About this topic

Related to
Service: ISIA
Promoters: Thierry Dutoit, Kevin El Haddad
