Dr Arindam Dey

Lecturer, Computing Digital Literacy

School of Information Technology and Electrical Engineering
Faculty of Engineering, Architecture and Information Technology
a.dey@uq.edu.au
+61 7 336 54537

Overview

**Currently recruiting Ph.D. students with interests in, and (some) experience with, augmented reality and/or virtual reality. Unity 3D experience and excellent programming skills are required.**

I am a Lecturer in the Co-innovation group of UQ's School of ITEE, focusing primarily on Mixed Reality, Empathic Computing, and Human-Computer Interaction. I am a proponent of "for good" research with these technologies and aim to create positive societal impact through my work. I believe in designing solutions for users and accordingly put users ahead of the technology. Most of my work involves user research and statistics.

Before joining the University of Queensland in August 2018, I was a Research Fellow at the Empathic Computing Laboratory (UniSA) from 2015 to 2018, working with one of the world leaders in Augmented Reality, Prof. Mark Billinghurst. Before that, I held postdoctoral positions at the University of Tasmania, Worcester Polytechnic Institute (USA), and James Cook University. I completed my Ph.D. at the University of South Australia under the supervision of Prof. Christian Sandor and Prof. Bruce Thomas, with a thesis titled "Perceptual characteristics of visualizations for occluded objects in handheld augmented reality". During my Ph.D. I undertook a research internship at TU Munich under the supervision of Prof. Gudrun Klinker. I regularly serve as an organizer and peer reviewer for multiple international conferences and journals related to my research interests.

I was born in Kolkata, India, where I lived for 25 years, and now live in Brisbane, Australia with my wife and daughter. When not working, I enjoy spending time with my family and playing cricket in the summer.

Research Interests

  • Mixed Reality (MR)
    MR is a technology continuum that covers both Augmented Reality (AR) and Virtual Reality (VR). I am interested in designing novel interfaces using AR, VR, or a combination of both in collaborative applications. Because MR is a user interface technology, I believe we must give utmost importance to users to drive innovation. I am keen to supervise students and build collaborations in the space of user-centered MR.
  • Augmented Reality (AR) and Virtual Reality (VR)
    Both of these technologies are part of the MR continuum and of my research interests. I am interested in applying them across multiple application domains, including but not limited to remote collaboration, affective computing, gaming, education, health, and sustainability. I am a proponent of driving "for good" research using these technologies. If you are interested in one of these areas, I will be very happy to collaborate with you and/or supervise you.
  • Empathic Computing
    Empathy is about seeing with the eyes of another, listening with the ears of another, and feeling with the heart of another. Empathic Computing is a research field that develops computer systems that recognize and share emotions and help people better understand one another. I am interested in creating and sharing empathy in computer-mediated collaboration (CSCW) and in single-user applications. This work involves physiological sensors and/or electroencephalography (EEG). Please contact me if you are interested in exploring this innovative space.
  • Human-Computer Interaction (HCI)
    While the above three areas are where I do most of my research, I am also interested in general HCI work, including interaction design and visual perception. I am happy to supervise students on other HCI-related projects as well.

Qualifications

  • Doctor of Philosophy, S.Aust.

Available Projects

  • In this project, we will explore creating collaborative learning applications for high-school students using augmented reality. Using a user-centered design approach, we will design, develop, and evaluate interactive educational content.

  • This project will design, develop, and evaluate cognitively adaptive interfaces (visualization and interaction methods) for learning, using both augmented and virtual reality technologies. Physiological, EEG, and eye-tracking data will be collected in real time and used to make the interface adaptive.

  • This project will design, develop, and evaluate visualization and interaction methods for sharing emotional cues between collaborators, helping them collaborate seamlessly and understand one another emotionally. The project will involve physiological sensors, EEG, and the Unity 3D or Unreal game engine.


Publications

Book Chapter

  • Livingston, Mark A., Dey, Arindam, Sandor, Christian and Thomas, Bruce H. (2012). Pursuit of "X-ray vision" for augmented reality. In Weidong Huang, Leila Alem and Mark A. Livingston (Eds.), Human factors in augmented reality environments (pp. 67-107). New York, NY, USA: Springer. doi:10.1007/978-1-4614-4205-9_4

Possible Research Projects

Note for students: The possible research projects listed on this page may not be comprehensive or up to date. Always feel free to contact the staff for more information, and also with your own research ideas.
