Sharing physiological cues to enhance XR collaboration
Eligible for funding* | PhD
In this project, we are interested in how sharing physiological cues, such as heart rate, in collaborative VR or AR could improve the user experience. Previous research has shown that physiological cues have a positive impact in video conferencing: users can infer their partner’s emotional and cognitive state from the physiological signals they see displayed. However, there has been very little research on how these cues can be conveyed in collaborative AR or VR settings.
We have previously developed a VR system that enables sharing of physiological cues, such as heart rate, cognitive load, and attentional state, among remote collaborators, facilitating a more comprehensive mutual understanding. Physiological signals such as EEG, heart rate, and EDA are measured in real time using body-worn sensors. These are used to create virtual cues in the VR environment that show each collaborator’s cognitive load, stress level, and attention.
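As an illustration of the kind of signal-to-cue mapping described above, here is a minimal Python sketch, not the project's actual code, that smooths noisy heart-rate samples and maps them to a coarse cue level for display in the virtual environment. The smoothing factor, the `resting` baseline, and the threshold offsets are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class CueState:
    """A visual cue derived from a physiological signal."""
    level: str    # "low", "medium", or "high" arousal
    value: float  # smoothed heart rate in BPM


def smooth(samples, alpha=0.3):
    """Exponentially smooth a sequence of noisy sensor readings."""
    out = samples[0]
    for s in samples[1:]:
        out = alpha * s + (1 - alpha) * out
    return out


def heart_rate_cue(samples, resting=65.0):
    """Map smoothed heart rate to a coarse cue level.

    The offsets (+10, +25 BPM over resting) are arbitrary
    placeholders; a real system would calibrate per user.
    """
    hr = smooth(samples)
    if hr < resting + 10:
        level = "low"
    elif hr < resting + 25:
        level = "medium"
    else:
        level = "high"
    return CueState(level=level, value=hr)
```

In a running system, a cue like this would be recomputed each time a new sensor sample arrives and rendered as, for example, a colour or avatar effect visible to the remote collaborator.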
Using this system, we conducted an empirical study to test the hypothesis that visualizing physiological cues would significantly impact collaborators’ empathy. Our approach leveraged VR’s ability to convey natural nonverbal communication cues alongside physiological signals, providing a more informative experience than traditional remote collaboration platforms.
This project will continue this work, but extend it in several ways, including:
- Analyzing conversational patterns, which could provide insights into how conversation changes in response to perceived physiological cues
- Using machine learning algorithms to automatically detect and respond to changes in users’ physiological states
- Exploring how best to represent physiological cues so that they are understood by remote collaborators, and studying the impact of different representations
- Optimizing the presentation and interpretation of shared physiological cues to enhance user engagement and attention
- Exploring how physiological cues can be used in AR to enhance face-to-face collaboration
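As a sketch of the machine-learning direction listed above, the simplest automatic detector of a change in a user’s physiological state is a z-score test against a recent baseline window. The code below is an illustrative assumption, not the project’s method; the minimum window size and threshold are placeholder values.

```python
import statistics


def detect_state_change(history, current, z_threshold=2.5):
    """Flag a physiological state change when the current reading
    deviates strongly from the recent baseline.

    history     -- recent readings forming the baseline window
    current     -- the newest reading to test
    z_threshold -- how many standard deviations count as a change
    """
    if len(history) < 5:
        return False  # not enough baseline data yet
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return current != mean  # flat baseline: any deviation is a change
    z = abs(current - mean) / stdev
    return z > z_threshold
```

A detector like this could trigger an adaptive response in the interface, such as emphasizing a collaborator’s stress cue only when a genuine change is detected rather than on every noisy sample.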
Desired skills
Experience with AR/VR interface development and designing user studies; interest in processing physiological data and emotion recognition.
Contact and supervisors
For more information or to apply for this project, please follow the link to the supervisor below:
Contact/Main supervisor
Supporting Supervisor
- Kunal Gupta
Eligible for funding*
This project is eligible for funding but is subject to eligibility criteria & funding availability.