AI-Enhanced Home-Based Optical Coherence Tomography for Eye Care
Collaborators
Supervisory Team
Project Description
Age-related Macular Degeneration (AMD) is a leading cause of vision loss in individuals over 60 and accounts for more than 50% of blindness registrations in the UK. Globally, 196 million people were affected by AMD in 2020, with numbers expected to rise to 288 million by 2040 due to ageing populations. Regular monitoring and treatment, such as intravitreal injections, are essential to slow disease progression. However, the current model—frequent in-person hospital visits and manual interpretation of Optical Coherence Tomography (OCT) scans—is resource-intensive and places a considerable burden on elderly patients and healthcare systems.
This project proposes an AI-enhanced, home-based OCT solution: a portable, wearable device capable of acquiring retinal images in the patient's home. These images will be analysed by advanced AI models to detect disease progression, enabling timely clinical intervention while reducing the need for frequent hospital visits.
Research Goals
- Develop robust AI models capable of detecting subtle changes in retinal images captured by home-based OCT devices.
- Enhance image quality and diagnostic reliability through data standardisation, noise reduction, and deep learning-based image enhancement.
- Improve clinical trust by integrating explainable AI (XAI) methods that provide interpretable insights into AI-driven decisions.
- Validate the AI framework on real-world datasets, in collaboration with clinicians, to assess accuracy, usability, and regulatory readiness.
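As a rough illustration of the noise-reduction goal above: OCT images are degraded by multiplicative speckle noise, and even a classical baseline such as a 3×3 median filter measurably improves a speckle-corrupted image. The sketch below is purely illustrative (synthetic data, hypothetical parameters); in the project itself, a learned enhancement model would replace this baseline.

```python
import numpy as np

def median_filter_3x3(img):
    """Classical 3x3 median filter: a simple baseline for speckle reduction."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    # Stack the nine shifted views of the image and take the per-pixel median.
    neighbourhoods = np.stack(
        [padded[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    )
    return np.median(neighbourhoods, axis=0)

rng = np.random.default_rng(0)
# Smooth intensity gradient as a stand-in for a retinal B-scan.
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
# Multiplicative gamma noise (mean 1) mimics OCT speckle.
noisy = clean * rng.gamma(4.0, 1.0 / 4.0, clean.shape)
denoised = median_filter_3x3(noisy)

mse_noisy = np.mean((noisy - clean) ** 2)
mse_denoised = np.mean((denoised - clean) ** 2)
```

Comparing `mse_denoised` against `mse_noisy` gives a simple quantitative check that the filter reduces error relative to the clean reference; the same evaluation pattern carries over when the median filter is swapped for a deep learning-based enhancer.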
Candidate Profile
- Background: Experience with computer vision, and proficiency in Python and deep learning frameworks such as PyTorch or TensorFlow.
- Skills: Knowledge of image processing techniques, particularly medical imaging and low-quality data enhancement.
- Clinical Interest: Familiarity with clinical data and biomedical signal analysis, and an interest in ophthalmology or healthcare AI.
- Ethics & Explainability: Interest in explainable AI and the ethical aspects of medical technology, particularly patient safety and clinical adoption.