Projects

Some of the projects undertaken at the Empathic XR Lab by our students at various levels – honours, masters, and Ph.D.

  • Biofeedback in Virtual Reality (2018–)

    Virtual reality (VR) is an influential medium for triggering emotional changes in humans. However, there is little research on making users of VR interfaces aware of their own emotional state or, in collaborative interfaces, of one another's.

    In this project, through a series of system developments and user evaluations, we are investigating how physiological data such as heart rate, galvanic skin response, pupil dilation, and EEG can be used to communicate emotional states either to the users themselves (single-user interfaces) or to their collaborators (collaborative interfaces). The overarching goal is to make VR environments more empathetic and collaborators more aware of each other's emotional states.
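
    As a rough, hypothetical illustration of the idea (not the lab's actual pipeline), the Python sketch below maps a stream of heart-rate samples onto a coarse arousal level that a VR client could surface to the user or to a collaborator; the baseline, window size, thresholds, and the share_with_collaborator stub are illustrative assumptions.

    ```python
    from statistics import mean

    # Hypothetical sketch: turn raw heart-rate samples (beats per minute) into a
    # coarse arousal level that a VR scene could surface to the user or a collaborator.
    # The baseline, window size, thresholds, and sharing stub are assumptions.

    BASELINE_BPM = 70.0   # assumed resting heart rate for this user
    WINDOW = 10           # number of recent samples to average over

    def arousal_level(samples_bpm):
        """Classify the most recent window of heart-rate samples as low/medium/high."""
        delta = mean(samples_bpm[-WINDOW:]) - BASELINE_BPM
        if delta < 5:
            return "low"
        if delta < 15:
            return "medium"
        return "high"

    def share_with_collaborator(level):
        """Stand-in for sending the cue to a collaborator's view, e.g. as an avatar tint."""
        print(f"collaborator sees arousal cue: {level}")

    if __name__ == "__main__":
        stream = [68, 70, 72, 75, 80, 84, 88, 90, 92, 95, 97, 99]
        share_with_collaborator(arousal_level(stream))
    ```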

  • Industrial Internet of Things (IoT) for Digital Aviation (2018)

    Industry 4.0 is based on real-time data collection, analysis, and management of smart factory operations through smart sensors, including wearable sensors on workers. This project explores the potential for wearable computing sensors and displays to contribute to both increased worker safety and improved production efficiency within smart factories.
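
    A minimal, hypothetical sketch of the data-collection side, assuming a reachable on-premises MQTT broker and the paho-mqtt 1.x client API; the broker address, topic layout, and payload fields below are illustrative assumptions, not the project's actual deployment.

    ```python
    import json
    import random
    import time

    import paho.mqtt.client as mqtt  # third-party client (pip install paho-mqtt)

    # Hypothetical sketch: a wearable node publishes worker telemetry to a factory
    # broker. Broker address, topic, and payload fields are assumptions.

    BROKER = "factory-broker.local"            # assumed on-premises MQTT broker
    TOPIC = "factory/line1/worker42/vitals"    # assumed topic layout

    def read_wearable():
        """Stand-in for real sensor reads (heart rate, ambient temperature)."""
        return {
            "heart_rate_bpm": random.randint(60, 110),
            "ambient_temp_c": round(random.uniform(18.0, 35.0), 1),
            "timestamp": time.time(),
        }

    def main():
        client = mqtt.Client()      # paho-mqtt 1.x style; 2.x also expects a CallbackAPIVersion
        client.connect(BROKER, 1883)
        client.loop_start()
        for _ in range(5):          # publish a handful of readings
            client.publish(TOPIC, json.dumps(read_wearable()), qos=1)
            time.sleep(1)
        client.loop_stop()
        client.disconnect()

    if __name__ == "__main__":
        main()
    ```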

  • LingoCube: Augmented Reality-Based Tangible User Interface for Interactive Language Learning (2019)

    Learning a second language is valuable in many ways, but the learning process is not straightforward. This project explores novel methods for language learning using augmented reality and tangible user interfaces.

  • Interacting in VR with Brain Signals and Facial Expression

    Virtual reality commonly requires handheld controllers and physical movement to interact with virtual environments. However, alternative interaction methods such as brain-computer interfaces, voice commands, and facial expressions can enable users with disabilities to interact with and experience VR. This project explores these alternative interaction methods for different tasks in VR.
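
    As a hypothetical sketch of how such inputs might be unified (not the project's actual design), the example below maps recognised events from different modalities onto a small, shared set of VR actions; the labels, mappings, and confidence threshold are assumptions.

    ```python
    from dataclasses import dataclass

    # Hypothetical input-abstraction sketch: events from an EEG classifier, a voice
    # recogniser, or a facial-expression detector all map onto the same VR actions,
    # so a task can be completed without handheld controllers. Labels are assumed.

    @dataclass
    class InputEvent:
        modality: str     # "eeg", "voice", or "face"
        label: str        # e.g. classifier output or recognised keyword
        confidence: float

    ACTION_MAP = {
        ("eeg", "imagined_push"): "select_object",
        ("voice", "grab"): "grab_object",
        ("voice", "release"): "release_object",
        ("face", "smile"): "confirm_dialog",
        ("face", "frown"): "cancel_dialog",
    }

    def to_vr_action(event, threshold=0.7):
        """Translate a recognised input event into a VR action, if confident enough."""
        if event.confidence < threshold:
            return None
        return ACTION_MAP.get((event.modality, event.label))

    if __name__ == "__main__":
        events = [
            InputEvent("eeg", "imagined_push", 0.82),
            InputEvent("face", "smile", 0.55),   # below threshold, ignored
            InputEvent("voice", "grab", 0.91),
        ]
        for e in events:
            print(e.modality, e.label, "->", to_vr_action(e))
    ```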

  • Virtual Reality Story Affordances

    Storytelling is an important domain in virtual reality, whether for raising awareness or simply for entertainment. However, experiences in virtual reality are subjective and depend on the user's interaction with the environment. An important question is how to ensure that users experience the story as the storyteller intends. This project investigates how different components of a virtual environment can be manipulated to align the user's experience with the storyteller's intention.

  • Lifelogging Thesis Projects

    Under the supervision of Dr Chelsea Dobbins, a number of thesis projects are underway that explore various aspects of lifelogging, including mobile and smartwatch app development.

  • Visualisation of ECG Data and Prediction of Heart Disease using Machine Learning

    This project focuses on the analysis and interpretation of the large volumes of data generated by electrocardiography (ECG) sensors. The results will be visualised and presented as meaningful, understandable graphics. The project will also explore the most appropriate machine learning techniques for predicting the likelihood of heart disease. The overall goal is to help users easily understand complex ECG data and to predict the onset of heart disease.
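
    As a purely illustrative sketch of the prediction step (not the project's actual method or data), the example below trains a scikit-learn classifier on synthetic per-recording ECG features and reports a held-out score; the feature set, synthetic labels, and model choice are assumptions.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Hypothetical sketch: estimate the likelihood of heart disease from simple ECG
    # features. All features and labels here are synthetic stand-ins; the project
    # would use features extracted from real ECG recordings.

    rng = np.random.default_rng(0)
    n = 500
    features = np.column_stack([
        rng.normal(75, 12, n),      # mean heart rate (bpm)
        rng.normal(0.16, 0.03, n),  # PR interval (s)
        rng.normal(0.40, 0.05, n),  # QT interval (s)
        rng.normal(0.05, 0.02, n),  # heart-rate variability, RMSSD (s)
    ])
    # Synthetic label loosely tied to heart rate and variability, for illustration only.
    risk = (features[:, 0] - 75) / 12 - (features[:, 3] - 0.05) / 0.02
    labels = (risk + rng.normal(0, 1, n) > 1).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.25, random_state=0)

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)

    probs = model.predict_proba(X_test)[:, 1]   # predicted likelihood of disease
    print("ROC AUC on held-out synthetic data:", round(roc_auc_score(y_test, probs), 3))
    ```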