
Project

Enhanced surgical 3D perception through autostereoscopic visualisation

This doctoral research focuses on developing autostereoscopic 3D visualisation for medical applications. The work will begin with a rigorous requirement analysis: tracking experiments in the operating room will characterise current clinical practice, including surgeons' movements and their viewing behaviour. Deep-learning methods will be explored for eye- and gaze-tracking algorithms that extract the position of the user's eyes. Because the display targets a single viewer, redundant estimation schemes will be developed to ensure robust tracking and to prioritise the correct user. Training setups will be designed in VR and in synthetic in-silico environments, in which tissue is replicated by synthetic material. Furthermore, 3D AR guidance with autostereoscopy will be investigated to expand the application opportunities in the clinic. The developed technologies will be validated in experiments with surgical trainers as well as in real clinical scenarios.
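The redundant estimation and user-prioritisation ideas above can be sketched in a few lines. The following is a minimal illustration in Python/NumPy, not the project's actual implementation: the function names, the median-based outlier rejection, the confidence weighting, and the distance-to-display-axis prioritisation rule are all assumptions chosen for clarity.

```python
import numpy as np

def fuse_eye_estimates(estimates, confidences, outlier_thresh=30.0):
    """Fuse redundant 2D eye-position estimates (in pixels) from
    several trackers into one robust estimate (hypothetical sketch).

    estimates   -- (N, 2) array of [x, y] positions from N trackers
    confidences -- (N,) array of per-tracker confidence weights
    Estimates farther than `outlier_thresh` pixels from the
    coordinate-wise median are discarded before weighted averaging.
    """
    est = np.asarray(estimates, dtype=float)
    conf = np.asarray(confidences, dtype=float)
    median = np.median(est, axis=0)              # robust reference point
    dist = np.linalg.norm(est - median, axis=1)  # deviation of each estimate
    keep = dist <= outlier_thresh                # reject gross outliers
    w = conf[keep] / conf[keep].sum()            # renormalise weights
    return (est[keep] * w[:, None]).sum(axis=0)

def prioritize_user(faces, display_center):
    """Pick the viewer to track for a single-viewer display: here,
    simply the detected face closest to the display axis (an assumed rule).
    `faces` is a list of (face_id, [x, y]) detections."""
    center = np.asarray(display_center, dtype=float)
    return min(faces,
               key=lambda f: np.linalg.norm(np.asarray(f[1]) - center))[0]
```

For example, fusing three estimates `[100, 50]`, `[102, 48]`, and `[300, 300]` with confidences `0.9`, `0.8`, and `0.2` discards the third as an outlier and returns a point between the first two, weighted toward the more confident tracker.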

Date: 1 Feb 2020 → Today
Keywords: Autostereoscopy, Image processing, Human-computer interaction
Disciplines: Image-guided interventions, Biomedical image processing, Mechatronics and robotics not elsewhere classified, Human-computer interaction
Project type: PhD project