Mixed and Augmented Reality Can Facilitate Seamless Medical Communication
MedicalResearch.com Interview with:
Dr Ian Williams PhD
Associate Professor, DMT Lab
Birmingham City University
Faculty of Computing, Engineering and the Built Environment
Centre for Digital Media Technology
Millennium Point
Birmingham UK
MedicalResearch.com: What is the background for your work?
Response: Our work at the DMT Lab (dmtlab.bcu.ac.uk) focuses on developing a novel Mixed Reality (MR) medical presentation platform that allows practitioners to interact with patient data and virtual anatomical models in real time. The system enables the presentation of medical data, models and procedures to patients, with the aim of educating them about upcoming procedures or the effects of lifestyle choices (for example, smoking or excessive alcohol consumption).
The system employs an exocentric mixed reality environment that can be deployed in any room. It integrates a live view of the medical practitioner with multimodal patient data, producing a real-time, co-located visualisation of both the practitioner and the data that the practitioner can interact with directly. We implement a natural interaction method that improves the user's level of direct interaction with the virtual models and provides more realistic control of the data.
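To make the co-located visualisation step more concrete, the following is a minimal Python/NumPy sketch of per-pixel alpha compositing, blending a rendered virtual layer (such as an anatomical model) over a live camera frame. The array shapes, function name and synthetic data are illustrative assumptions, not the DMT Lab implementation.

```python
import numpy as np

def composite_overlay(camera_frame, virtual_layer, alpha_mask):
    """Blend a rendered virtual layer over a camera frame of the real scene.

    camera_frame : (H, W, 3) uint8 array - the real scene
    virtual_layer: (H, W, 3) uint8 array - the rendered virtual content
    alpha_mask   : (H, W) float array in [0, 1] - opacity of the virtual layer
    """
    alpha = alpha_mask[..., np.newaxis]                 # broadcast over colour channels
    blended = (1.0 - alpha) * camera_frame + alpha * virtual_layer
    return blended.astype(np.uint8)

# Toy usage with synthetic data (a real system would use tracked camera frames)
frame = np.zeros((480, 640, 3), dtype=np.uint8)         # stand-in camera frame
organ = np.full((480, 640, 3), 200, dtype=np.uint8)     # stand-in rendered model
mask = np.zeros((480, 640))
mask[100:300, 200:400] = 0.6                            # semi-transparent overlay region
output = composite_overlay(frame, organ, mask)
print(output.shape, output.dtype)
```

In practice the virtual layer would be re-rendered each frame from the tracked viewpoint, but the blending idea is the same.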
The system can also be used in a fun educational setting, where patients, students, children or any first-time user can learn about medical anatomical information via a real-time interactive mixed reality "body scanner". This system overlays the MR information onto their own body in real time, showing them scaled, interactive virtual organs and anatomy together with the corresponding medical information. We are aiming for this system to be used not only in patient education but also in engaging and informing people about lifestyle choices.
MedicalResearch.com: Can you briefly describe what is meant by mixed and augmented reality?
Response: Mixed and augmented reality (MR and AR) systems allow a real scene to be enhanced, extended or augmented with virtual objects or information. These systems are often deployed on wearable devices such as the Microsoft HoloLens, or in exocentric environments such as immersive CAVE or projection rooms. Irrespective of the hardware, the aim of MR and AR systems is to integrate digital information seamlessly with the real scene in order to augment and enhance the real world.
MedicalResearch.com: What types of medical or surgical problems do you envision can be enhanced with the use of freehand gestures to manipulate patient data?
Response: Mixed reality has enormous potential within the medical field, and healthcare is already being profoundly affected by recent developments. Mixed reality technology can also provide a platform for seamless, real-time doctor-patient communication. The system we are developing provides a real-time augmented view of the patient's data, which can be overlaid onto the patient or manipulated via freehand interaction, without the use of complex wearable devices.
Many current mixed reality systems rely on bespoke sensors and cumbersome wearable devices (for example, haptic gloves), whereas our system supports freehand interaction without requiring the medical practitioner or patient to wear any device. This interaction method enables a more natural virtual interface, and by using physically inspired interaction models (for example, common real-world grasping types) we bridge the gap between users and technology. This form of natural interaction also appears more realistic to an observer.
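As a loose illustration of how a simple grasp type might be recognised from freehand tracking data, here is a short Python sketch that classifies a pinch grasp from the distance between two fingertip positions. The landmark names, coordinates and threshold are hypothetical examples, not the interface of the authors' system.

```python
import math

# Hypothetical hand-landmark positions in metres, e.g. from a depth or skeletal
# tracker (names and values are illustrative only).
landmarks = {
    "thumb_tip": (0.10, 0.02, 0.45),
    "index_tip": (0.11, 0.03, 0.46),
    "palm_centre": (0.08, -0.02, 0.47),
}

def is_pinch_grasp(lm, threshold=0.03):
    """Classify a simple pinch grasp: thumb and index fingertips closer than
    a small threshold (3 cm here, chosen arbitrarily for illustration)."""
    return math.dist(lm["thumb_tip"], lm["index_tip"]) < threshold

if is_pinch_grasp(landmarks):
    print("Pinch grasp detected: attach virtual model to hand position")
else:
    print("No grasp: model remains free")
```

A fuller interaction model would distinguish several grasp types and drive rotation, scaling and translation of the virtual models, but the same kind of geometric test underlies each gesture.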
At present we are developing methods to use the system to demonstrate joint replacement surgery before and after the operation. This allows a medical practitioner to demonstrate both the pre- and post-surgery range of motion using interactive virtual models directly in front of the patient. Patients can also interact with the system themselves (if they wish) and so see the potential results and benefits they may experience after surgery.
MedicalResearch.com: What should readers take away from your report?
Response: Our work is at the forefront of developing these novel interactive healthcare applications for mixed and augmented reality. We are pushing to enable a more natural form of interaction with the virtual models and digital patient data, and to develop more comprehensive natural interaction models. In addition, we are looking to extend the application of the system, notably in the wider medical simulation domain.
MedicalResearch.com: What recommendations do you have for future research as a result of this study?
Response: At present we are expanding the functionality of the system to include additional hardware sensors for incorporating real-time patient data. We are also developing the system to interface with current hardware in novel medical applications (for example, the Microsoft HoloLens), and developing methods for integrating patient-specific data directly with the virtual models. This includes overlaying and visualising results from MRI and CT scans directly onto the patient and onto the virtual models themselves. Ultimately, we propose that the system will deliver a unique, personalised and interactive patient experience, helping practitioners to engage more with their patients and enhancing the way they interface with medical information.
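As a rough sketch of one preprocessing step such patient-specific integration might involve, the following Python/NumPy example applies a simple intensity window to a CT slice (in Hounsfield units) to produce an overlay texture. The function name, window values and synthetic data are assumptions for illustration only and are not drawn from the authors' system.

```python
import numpy as np

def window_ct_slice(slice_hu, center=40.0, width=400.0):
    """Apply a simple intensity window to a CT slice in Hounsfield units,
    returning values scaled to [0, 1] for use as an overlay texture."""
    low, high = center - width / 2.0, center + width / 2.0
    return np.clip((slice_hu - low) / (high - low), 0.0, 1.0)

# Synthetic 2D "slice" standing in for one plane of a CT volume
slice_hu = np.random.uniform(-1000, 1000, size=(256, 256))
texture = window_ct_slice(slice_hu)        # greyscale texture in [0, 1]
print(texture.min(), texture.max())
```

In a deployed system the windowed slice would then be registered to the patient or to the virtual anatomical model before being composited into the mixed reality view.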
MedicalResearch.com: Thank you for your contribution to the MedicalResearch.com community.
Note: Content is not intended as medical advice. Please consult your health care provider regarding your specific medical condition and questions.
Last Updated on July 14, 2017 by Marie Benz MD FAAD