Arindam Dey

Director

I am a Lecturer in the Co-innovation group of UQ’s School of ITEE, primarily focusing on Extended Reality, Empathic Computing, and Human-Computer Interaction. I am a proponent of “for good” research with these technologies and aim to create positive societal impact with my work. I believe in designing solutions for users, and accordingly I put users ahead of the technology. Most of my work involves user research and statistics.

Before joining the University of Queensland in August 2018, I was a Research Fellow at the Empathic Computing Laboratory (UniSA) from 2015 to 2018, working with Prof. Mark Billinghurst, one of the world leaders in Augmented Reality. Earlier, I held postdoctoral positions at the University of Tasmania, Worcester Polytechnic Institute (USA), and James Cook University. I completed my Ph.D. at the University of South Australia under the supervision of Prof. Christian Sandor and Prof. Bruce Thomas, with a thesis titled “Perceptual characteristics of visualizations for occluded objects in handheld augmented reality.” During my Ph.D., I did a research internship at TU Munich under the supervision of Prof. Gudrun Klinker. I regularly serve as an organizer and peer reviewer for international conferences and journals related to my research interests.

I was born in Kolkata, India, where I lived for 25 years; I now live in Brisbane, Australia with my wife and daughter! When not working, I enjoy spending time with my family and playing cricket in the summer.

Please visit my webpage on UQ Researchers.

Projects

  • Biofeedback in Virtual Reality

    Virtual reality (VR) is an influential medium for triggering emotional changes in humans. However, there is little research on making users of VR interfaces aware of their own emotional state or, in collaborative interfaces, of one another's. In this project, through a series of system developments and user evaluations, we are investigating how physiological data such as heart rate, galvanic skin response, pupil dilation, and EEG can be used as a medium to communicate emotional states either to the self (single-user interfaces) or to a collaborator (collaborative interfaces). The overarching goal is to make VR environments more empathetic and collaborators more aware of each other's emotional states. A minimal sketch of this feedback loop appears after the project list.

  • Virtual Reality Story Affordances

    Storytelling is an important application of virtual reality, whether for raising awareness or simply for entertainment. However, experiences in virtual reality are subjective and depend on the user's interaction with the environment. An important question is: how can we ensure that the user experiences the story the way the storyteller wants it told? This project investigates how different components of a virtual environment can be manipulated to align the experience of the user with the intention of the storyteller.

  • Interacting in VR with Brain Signals and Facial Expression

    Virtual reality commonly requires handheld controllers and physical movement to interact with the environment. However, alternative interaction methods such as brain-computer interfaces, voice commands, and facial expressions can enable users with disabilities to interact with and experience VR. This project explores these alternative interaction methods for different tasks in VR; a small input-dispatch sketch follows the project list.

  • LingoCube: Augmented Reality-Based Tangible User Interface for Interactive Language Learning

    Learning a second language is valuable in many ways, but the process of learning one is not straightforward. This project explores novel methods of language learning using augmented reality and tangible user interfaces; a marker-detection sketch follows the project list.

  • Adaptive Virtual Interfaces

    This project explores how virtual reality interfaces can be made more effective by adapting them to users' emotional and cognitive needs.
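
For the biofeedback project above, the following minimal Python sketch shows the core loop: poll a physiological sensor, derive a coarse arousal estimate, and share it with a collaborator. The functions read_heart_rate and send_to_collaborator are hypothetical placeholders for a device SDK and a VR networking layer, and the thresholds are illustrative rather than the project's actual values.

    import time

    RESTING_HR = 70  # assumed per-user baseline, calibrated before a session

    def classify_arousal(heart_rate, baseline=RESTING_HR):
        """Map a raw heart-rate sample (bpm) to a coarse arousal label."""
        if heart_rate > baseline * 1.2:
            return "high"
        if heart_rate < baseline * 0.9:
            return "low"
        return "neutral"

    def biofeedback_loop(read_heart_rate, send_to_collaborator, hz=4):
        """Poll the sensor and share an arousal estimate with the collaborator."""
        while True:
            hr = read_heart_rate()  # e.g. a chest strap or PPG sensor reading
            send_to_collaborator({"hr": hr, "arousal": classify_arousal(hr)})
            time.sleep(1.0 / hz)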
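
For the brain-signals and facial-expression project, the sketch below shows one way to route the output of an upstream classifier (EEG- or facial-expression-based) into VR actions while ignoring low-confidence detections. The labels, threshold, and actions are illustrative assumptions, not the project's actual design.

    from typing import Callable, Dict

    class HandsFreeController:
        """Route classifier labels (e.g. 'smile', 'frown') to VR actions."""

        def __init__(self) -> None:
            self._bindings: Dict[str, Callable[[], None]] = {}

        def bind(self, label: str, action: Callable[[], None]) -> None:
            self._bindings[label] = action

        def on_event(self, label: str, confidence: float, threshold: float = 0.8) -> None:
            # Discard low-confidence classifications to avoid spurious triggers.
            if confidence >= threshold and label in self._bindings:
                self._bindings[label]()

    controller = HandsFreeController()
    controller.bind("smile", lambda: print("select object"))
    controller.bind("frown", lambda: print("cancel"))
    controller.on_event("smile", confidence=0.93)  # -> select object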
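
For LingoCube, one plausible implementation is a physical cube with a fiducial marker on each face, where the visible face selects the vocabulary item to overlay. The sketch below uses OpenCV's ArUco module (the ArucoDetector API in opencv-contrib-python 4.7+); the marker-to-word mapping is purely illustrative, as the project's actual implementation is not described here.

    import cv2

    # Hypothetical mapping from marker id (one per cube face) to a word pair.
    VOCAB = {0: ("dog", "perro"), 1: ("cat", "gato"), 2: ("house", "casa")}

    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        corners, ids, _ = detector.detectMarkers(frame)
        if ids is not None:
            for marker_id, quad in zip(ids.flatten(), corners):
                if int(marker_id) in VOCAB:
                    english, spanish = VOCAB[int(marker_id)]
                    x, y = quad[0][0]  # first corner of the detected marker
                    cv2.putText(frame, f"{english} = {spanish}", (int(x), int(y) - 10),
                                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        cv2.imshow("LingoCube sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()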

Publications

  • A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014
    Arindam Dey, Mark Billinghurst, Robert W. Lindeman, J. Edward Swan II

    Dey A, Billinghurst M, Lindeman RW and Swan JE II (2018) A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014. Front. Robot. AI 5:37. doi: 10.3389/frobt.2018.00037

    @ARTICLE{10.3389/frobt.2018.00037,
    AUTHOR={Dey, Arindam and Billinghurst, Mark and Lindeman, Robert W. and Swan, J. Edward},
    TITLE={A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014},
    JOURNAL={Frontiers in Robotics and AI},
    VOLUME={5},
    PAGES={37},
    YEAR={2018},
    URL={https://www.frontiersin.org/article/10.3389/frobt.2018.00037},
    DOI={10.3389/frobt.2018.00037},
    ISSN={2296-9144},
    }
    Augmented Reality (AR) interfaces have been studied extensively over the last few decades, with a growing number of user-based experiments. In this paper, we systematically review 10 years of the most influential AR user studies, from 2005 to 2014. A total of 291 papers with 369 individual user studies have been reviewed and classified based on their application areas. The primary contribution of the review is to present the broad landscape of user-based AR research, and to provide a high-level view of how that landscape has changed. We summarize the high-level contributions from each category of papers, and present examples of the most influential user studies. We also identify areas where there have been few user studies, and opportunities for future research. Among other things, we find that there is a growing trend toward handheld AR user studies, and that most studies are conducted in laboratory settings and do not involve pilot testing. This research will be useful for AR researchers who want to follow best practices in designing their own AR user studies.
  • He who hesitates is lost (...in thoughts over a robot)
    James Wen, Amanda Stewart, Mark Billinghurst, Arindam Dey, Chad Tossell, Victor Finomore

    James Wen, Amanda Stewart, Mark Billinghurst, Arindam Dey, Chad Tossell, and Victor Finomore. 2018. He who hesitates is lost (...in thoughts over a robot). In Proceedings of the Technology, Mind, and Society (TechMindSociety '18). ACM, New York, NY, USA, Article 43, 6 pages. DOI: https://doi.org/10.1145/3183654.3183703

    @inproceedings{Wen:2018:HHL:3183654.3183703,
    author = {Wen, James and Stewart, Amanda and Billinghurst, Mark and Dey, Arindam and Tossell, Chad and Finomore, Victor},
    title = {He Who Hesitates is Lost (...In Thoughts over a Robot)},
    booktitle = {Proceedings of the Technology, Mind, and Society},
    series = {TechMindSociety '18},
    year = {2018},
    isbn = {978-1-4503-5420-2},
    location = {Washington, DC, USA},
    pages = {43:1--43:6},
    articleno = {43},
    numpages = {6},
    url = {http://doi.acm.org/10.1145/3183654.3183703},
    doi = {10.1145/3183654.3183703},
    acmid = {3183703},
    publisher = {ACM},
    address = {New York, NY, USA},
    keywords = {Anthropomorphism, Empathy, Human Machine Team, Robotics, User Study},
    }
    In a team, the strong bonds that can form between teammates are often seen as critical for reaching peak performance. This perspective may need to be reconsidered, however, if some team members are autonomous robots since establishing bonds with fundamentally inanimate and expendable objects may prove counterproductive. Previous work has measured empathic responses towards robots as singular events at the conclusion of experimental sessions. As relationships extend over long periods of time, sustained empathic behavior towards robots would be of interest. In order to measure user actions that may vary over time and are affected by empathy towards a robot teammate, we created the TEAMMATE simulation system. Our findings suggest that inducing empathy through a back story narrative can significantly change participant decisions in actions that may have consequences for a robot companion over time. The results of our study can have strong implications for the overall performance of human machine teams.
  • The Effects of Sharing Awareness Cues in Collaborative Mixed Reality
    Thammathip Piumsomboon, Arindam Dey, Barrett Ens, Gun Lee, Mark Billinghurst

    Piumsomboon, T., Dey, A., Ens, B., Lee, G. and Billinghurst, M., 2019. The Effects of Sharing Awareness Cues in Collaborative Mixed Reality. Frontiers in Robotics and AI, 6, p.5.

    @ARTICLE{10.3389/frobt.2019.00005,
    AUTHOR={Piumsomboon, Thammathip and Dey, Arindam and Ens, Barrett and Lee, Gun and Billinghurst, Mark},
    TITLE={The Effects of Sharing Awareness Cues in Collaborative Mixed Reality},
    JOURNAL={Frontiers in Robotics and AI},
    VOLUME={6},
    PAGES={5},
    YEAR={2019},
    URL={https://www.frontiersin.org/article/10.3389/frobt.2019.00005},
    DOI={10.3389/frobt.2019.00005},
    ISSN={2296-9144},
    }
    Augmented and Virtual Reality provide unique capabilities for Mixed Reality collaboration. This paper explores how different combinations of virtual awareness cues can provide users with valuable information about their collaborator’s attention and actions. In a user study (n=32, 16 pairs), we compared different combinations of three cues: Field-of-View (FoV) frustum, Eye-gaze ray, and Head-gaze ray against a baseline condition showing only virtual representations of each collaborator’s head and hands. Through a collaborative object finding and placing task, the results showed that awareness cues significantly improved user performance, usability, and subjective preferences, with the combination of the FoV frustum and the Head-gaze ray being best. This work establishes the feasibility of room-scale MR collaboration and the utility of providing virtual awareness cues.
  • Effects of Manipulating Physiological Feedback in Immersive Virtual Environments
    Arindam Dey, Hao Chen, Mark Billinghurst, Robert W Lindeman

    Dey, A., Chen, H., Billinghurst, M. and Lindeman, R.W., 2018, October. Effects of Manipulating Physiological Feedback in Immersive Virtual Environments. In Proceedings of the 2018 Annual Symposium on Computer-Human Interaction in Play (pp. 101-111). ACM.

    @inproceedings{dey2018effects,
    title={Effects of Manipulating Physiological Feedback in Immersive Virtual Environments},
    author={Dey, Arindam and Chen, Hao and Billinghurst, Mark and Lindeman, Robert W},
    booktitle={Proceedings of the 2018 Annual Symposium on Computer-Human Interaction in Play},
    pages={101--111},
    year={2018},
    organization={ACM}
    }
    Virtual environments have been proven to be effective in evoking emotions. Earlier research has found that physiological data is a valid measurement of the emotional state of the user. Being able to see one's physiological feedback in a virtual environment has proven to make the application more enjoyable. In this paper, we have investigated the effects of manipulating heart rate feedback provided to the participants in a single user immersive virtual environment. Our results show that providing slightly faster or slower real-time heart rate feedback can alter participants' emotions more than providing unmodified feedback. However, altering the feedback does not alter real physiological signals.
  • Effects of Sharing Real-Time Multi-Sensory Heart Rate Feedback in Different Immersive Collaborative Virtual Environments
    Arindam Dey, Hao Chen, Chang Zhuang, Mark Billinghurst, Robert W Lindeman

    Dey, A., Chen, H., Zhuang, C., Billinghurst, M. and Lindeman, R.W., 2018, October. Effects of Sharing Real-Time Multi-Sensory Heart Rate Feedback in Different Immersive Collaborative Virtual Environments. In 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 165-173). IEEE.

    @INPROCEEDINGS{8613762,
    author={A. {Dey} and H. {Chen} and C. {Zhuang} and M. {Billinghurst} and R. W. {Lindeman}},
    booktitle={2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
    title={Effects of Sharing Real-Time Multi-Sensory Heart Rate Feedback in Different Immersive Collaborative Virtual Environments},
    year={2018},
    volume={},
    number={},
    pages={165-173},
    keywords={feedback;groupware;virtual reality;immersive collaborative virtual environments;real-time multisensory heart rate feedback;collaborative VR environments;real-time heart rate feedback participants;providing heart rate feedback;single user environments;providing physiological feedback;Heart rate;Collaboration;Real-time systems;Physiology;Avatars;Task analysis;Virtual environments},
    doi={10.1109/ISMAR.2018.00052},
    ISSN={1554-7868},
    month={Oct},}
    Collaboration is an important application area for virtual reality (VR). However, unlike in the real world, collaboration in VR misses important empathetic cues that can make collaborators aware of each other's emotional states. Providing physiological feedback, such as heart rate or respiration rate, to users in VR has been shown to create a positive impact in single user environments. In this paper, through a rigorous mixed-factorial user experiment, we evaluated how providing heart rate feedback to collaborators influences their collaboration in three different environments requiring different kinds of collaboration. We have found that when provided with real-time heart rate feedback participants felt the presence of the collaborator more and felt that they understood their collaborator's emotional state more. Heart rate feedback also made participants feel more dominant when performing the task. We discuss the implication of this research for collaborative VR environments, provide design guidelines, and directions for future research.
  • Exploration of an EEG-Based Cognitively Adaptive Training System in Virtual Reality
    Arindam Dey, Alex Chatburn, Mark Billinghurst

    A. Dey, A. Chatburn and M. Billinghurst, "Exploration of an EEG-Based Cognitively Adaptive Training System in Virtual Reality," 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 2019, pp. 220-226.

    @INPROCEEDINGS{8797840,
    author={A. {Dey} and A. {Chatburn} and M. {Billinghurst}},
    booktitle={2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)},
    title={Exploration of an EEG-Based Cognitively Adaptive Training System in Virtual Reality},
    year={2019},
    volume={},
    number={},
    pages={220-226},
    keywords={Virtual Reality;Cognitively Adaptive Training;Electroencephalography;Alpha Activity;H.1.2 [Models and Principles]: User/Machine Systems-Human Factors;H.5.1 [Multimedia Information Systems]: Artificial-Augmented and Virtual Realities},
    doi={10.1109/VR.2019.8797840},
    ISSN={2642-5254},
    month={March},}
    Virtual Reality (VR) is effective in various training scenarios across multiple domains, such as education, health and defense. However, most of those applications are not adaptive to the real-time cognitive or subjectively experienced load placed on the trainee. In this paper, we explore a cognitively adaptive training system based on real-time measurement of task related alpha activity in the brain. This measurement was made by a 32-channel mobile Electroencephalography (EEG) system, and was used to adapt the task difficulty to an ideal level which challenged our participants, and thus theoretically induces the best level of performance gains as a result of training. Our system required participants to select target objects in VR and the complexity of the task adapted to the alpha activity in the brain. A total of 14 participants undertook our training and completed 20 levels of increasing complexity. Our study identified significant differences in brain activity in response to increasing levels of task complexity, but response time did not alter as a function of task difficulty. Collectively, we interpret this to indicate the brain's ability to compensate for higher task load without affecting behaviourally measured visuomotor performance.
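
A minimal sketch of the closed loop described in the paper above: estimate task-related alpha-band (8-13 Hz) power from an EEG window and nudge task difficulty toward a target band. The window length, thresholds, step size, and the mapping from alpha power to load are illustrative assumptions; the paper does not publish this exact controller.

    import numpy as np
    from scipy.signal import welch

    def alpha_power(eeg_window: np.ndarray, fs: float) -> float:
        """Mean spectral power in the 8-13 Hz band for one EEG channel."""
        freqs, psd = welch(eeg_window, fs=fs, nperseg=min(len(eeg_window), int(fs * 2)))
        band = (freqs >= 8) & (freqs <= 13)
        return float(np.mean(psd[band]))

    def adapt_difficulty(level: int, alpha: float, low: float, high: float) -> int:
        """Raise difficulty on spare capacity, lower it on overload."""
        if alpha > high:   # assumed marker of spare capacity (alpha idling)
            return level + 1
        if alpha < low:    # assumed marker of high task load (alpha suppression)
            return max(1, level - 1)
        return level

    # Example with synthetic data: one 2-second window at 250 Hz.
    rng = np.random.default_rng(0)
    window = rng.standard_normal(500)
    print(adapt_difficulty(level=5, alpha=alpha_power(window, fs=250), low=0.5, high=2.0))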
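
Relatedly, the manipulation in “Effects of Manipulating Physiological Feedback in Immersive Virtual Environments” above can be sketched as a simple scaling of the real signal before it is played back to the user. Whether the study used a multiplicative factor or some other transform is not stated here, so the factor values are assumptions.

    def manipulated_feedback(real_hr: float, factor: float) -> float:
        """Heart rate to present to the user, in bpm.

        factor > 1.0 presents slightly faster feedback, factor < 1.0 slightly
        slower, and factor == 1.0 reproduces the unmodified signal.
        """
        return real_hr * factor

    # Example: a participant at 72 bpm under three feedback conditions.
    for label, factor in [("slower", 0.9), ("unmodified", 1.0), ("faster", 1.1)]:
        print(label, manipulated_feedback(72.0, factor))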
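
Finally, for “The Effects of Sharing Awareness Cues in Collaborative Mixed Reality” above, a head-gaze ray cue can be derived from a collaborator's shared head pose as sketched below. The conventions are assumptions: a unit quaternion in (w, x, y, z) order and a -Z forward axis.

    import numpy as np

    def rotate(q: np.ndarray, v: np.ndarray) -> np.ndarray:
        """Rotate vector v by unit quaternion q = (w, x, y, z)."""
        w, u = q[0], q[1:]
        t = 2.0 * np.cross(u, v)
        return v + w * t + np.cross(u, t)

    def head_gaze_ray(head_pos: np.ndarray, head_rot: np.ndarray, length: float = 5.0):
        """Endpoints of the ray to render from the collaborator's head."""
        forward = rotate(head_rot, np.array([0.0, 0.0, -1.0]))
        return head_pos, head_pos + length * forward

    # Example: a head at the origin, rotated 90 degrees about +Y (looks along -X).
    q = np.array([np.cos(np.pi / 4), 0.0, np.sin(np.pi / 4), 0.0])
    print(head_gaze_ray(np.zeros(3), q))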