Empathic Computing Laboratory
The Empathic Computing Lab’s mission is to develop software systems that allow people to share with others what they’re seeing, hearing and feeling.
One of ABI’s newest labs, established in 2018, the Empathic Computing Lab is led by world-renowned augmented reality expert Professor Mark Billinghurst. Its members use emerging hardware technologies to access what people are experiencing.
An example of technology with an empathic element is a headset that can relay emotional and physiological information about the wearer, such as their facial expression and heart rate. Facial expression sensing already exists in AffectiveWear: glasses fitted with small photo-sensors that measure the distance from the glasses frame to the skin. These measurements change when we smile, frown or gasp.
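To make the idea concrete, here is a minimal sketch of how frame-to-skin distance readings could be mapped to coarse expression labels. The sensor sites, baseline values, thresholds and the direction of each skin movement are all invented for illustration; AffectiveWear's actual sensing and classification methods are not described in this article.

```python
# Illustrative sketch only: sensor sites, baselines, thresholds and the
# direction of skin movement are assumptions, not AffectiveWear's design.
# We assume the glasses report one frame-to-skin distance (in mm) per sensor.

NEUTRAL_BASELINE = {"brow": 4.0, "cheek": 5.0, "mouth": 6.0}  # hypothetical sites


def classify_expression(readings, baseline=NEUTRAL_BASELINE, tolerance=0.5):
    """Return a coarse expression label from frame-to-skin distances."""
    # How far each reading deviates from the wearer's neutral face.
    delta = {site: readings[site] - baseline[site] for site in baseline}
    if delta["cheek"] < -tolerance and delta["mouth"] < -tolerance:
        return "smile"  # assumed: cheeks and mouth skin move toward the frame
    if delta["brow"] < -tolerance:
        return "frown"  # assumed: brow lowers toward the frame
    if delta["mouth"] > tolerance:
        return "gasp"   # assumed: mouth opens, skin pulls away from the frame
    return "neutral"


print(classify_expression({"brow": 4.0, "cheek": 4.2, "mouth": 5.3}))  # smile
```

In practice a system like this would be calibrated per wearer (everyone's neutral baseline differs) and would likely use a trained classifier rather than hand-set thresholds, but the principle is the same: expressions are inferred from small changes in sensor-to-skin distance.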
“We wanted to relate the cues you usually have in face-to-face conversation, as these show understanding. Using our technology, it all becomes a much more immersive experience. You feel as though you’re standing inside the body of the person wearing the headset… In trials of sharing eye gaze, we’ve found people experience a stronger sense of collaboration and communication,” says Professor Mark Billinghurst.
Our research
The work conducted in the Empathic Computing Lab is at the junction of three computer interface trends.
These trends are:
- The way we capture content, which has advanced from still photography in the 1850s to today’s streaming 360-degree video on a portable device
- Increasing bandwidth, which allows you to download a movie in seconds and do higher-quality video conferencing
- ‘Implicit understanding’, where computers are able to watch and listen to us in order to understand what we are doing, for example playing games on a console by moving your body. Professor Billinghurst explains: “So to some extent computers have become more human-like. They can recognise our behaviour.”