Background

Echocardiography’s diagnostic reliability hinges on image quality, which remains highly operator-dependent. While AI guidance systems can assist image acquisition, existing UIs are often static and non-personalized, and their lack of explainability limits trust and long-term skill development.

Research Questions

  • How can adaptive user interfaces improve AI guidance for users of varying expertise?

  • What explainable AI (XAI) features foster trust, learning, and autonomy in clinical workflows?

  • How do adaptive and explainable UIs compare with conventional static systems in usability, performance, and user experience?

Aim

  • Design, prototype, and evaluate adaptive and explainable UIs for AI-based real-time image quality feedback in echocardiography (a minimal sketch of such a feedback loop follows).
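
As a concrete picture of what "real-time image quality feedback" could mean at the UI level, the Python sketch below polls frames from an acquisition source, scores each one with a quality model, and pushes the result to a rendering callback at a fixed rate. The frame source, score_frame wrapper, and render callback are all hypothetical placeholders for illustration, not components of an existing system.

```python
# Minimal sketch of a real-time quality-feedback loop. The frame source,
# scoring model, and renderer are hypothetical placeholders.
import random
import time

def quality_feedback_loop(frame_source, score_frame, render, hz: float = 5.0):
    """Score each incoming frame and surface feedback at a fixed UI rate."""
    interval = 1.0 / hz
    for frame in frame_source:
        quality = score_frame(frame)  # e.g. 0-1 overall quality from a CNN
        render(quality)               # update the on-screen guidance widget
        time.sleep(interval)          # throttle so feedback stays readable

# Demo with placeholder frames and a random stand-in for the quality model.
quality_feedback_loop(
    frame_source=range(3),
    score_frame=lambda frame: random.random(),
    render=lambda q: print(f"image quality: {q:.0%}"),
    hz=2.0,
)
```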

Objectives

  • User-Centered Requirements

    • Define user requirements through interviews with clinicians across skill levels.

  • Prototype Development

    • Design interface prototypes integrating adaptive logic and explainable feedback (see the sketch after this list).

  • Evaluation

    • Evaluate performance and usability through controlled simulation-based experiments.
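
To make "adaptive logic" and "explainable feedback" concrete, here is a minimal Python sketch in which the same per-criterion model output is rendered at different levels of detail depending on the user's estimated expertise. All names, criteria, and thresholds (ViewQuality, explain_feedback, the 0.6 cut-off) are hypothetical illustrations rather than a proposed design.

```python
# Hypothetical sketch: one model output, three expertise-adapted renderings.
from dataclasses import dataclass

@dataclass
class ViewQuality:
    """Per-criterion scores (0-1) an image-quality model might emit per frame."""
    overall: float
    criteria: dict[str, float]

def explain_feedback(quality: ViewQuality, expertise: str) -> str:
    """Adapt feedback detail and framing to the user's estimated expertise."""
    worst_first = sorted(quality.criteria.items(), key=lambda kv: kv[1])
    if expertise == "novice":
        # One concrete, actionable instruction at a time for novices.
        name, score = worst_first[0]
        return f"Quality {quality.overall:.0%}. Next step: improve {name} ({score:.0%})."
    if expertise == "intermediate":
        # All weak criteria, so intermediates can prioritise for themselves.
        weak = [f"{n} ({s:.0%})" for n, s in worst_first if s < 0.6]
        return f"Quality {quality.overall:.0%}. Weak criteria: {', '.join(weak) or 'none'}."
    # Raw breakdown only for experts, preserving autonomy.
    return " | ".join(f"{n}: {s:.0%}" for n, s in quality.criteria.items())

q = ViewQuality(overall=0.68,
                criteria={"LV visibility": 0.9, "probe angle": 0.4, "depth": 0.7})
for level in ("novice", "intermediate", "expert"):
    print(f"{level:>12}: {explain_feedback(q, level)}")
```

The explanation here is deliberately criterion-level rather than pixel-level; saliency-map overlays would be an alternative XAI channel the prototypes could explore.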

Methodology

  • Year 1: Requirements gathering via literature review and clinician interviews.

  • Year 2: Develop interactive UI prototypes with adaptive/XAI features; conduct formative usability tests.

  • Year 3: Run a comparative evaluation on a high-fidelity echocardiography simulator with users ranging from novice to expert.

  • Analyze trust, task performance, skill perception, and usability using mixed methods (a sketch of the quantitative strand follows this list).
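
For the quantitative strand, one common choice (an assumption here, not a commitment the project has made) is the System Usability Scale (SUS) together with a nonparametric between-condition test. The sketch below scores a ten-item SUS questionnaire in the standard way (odd items contribute response - 1, even items 5 - response, summed and scaled by 2.5) and compares hypothetical per-participant scores for static versus adaptive conditions with SciPy's Mann-Whitney U test; all numbers are illustrative placeholders, not study data.

```python
# SUS scoring plus a between-condition comparison. All responses and
# scores below are illustrative placeholders, not real study data.
from scipy.stats import mannwhitneyu

def sus_score(responses: list[int]) -> float:
    """Standard System Usability Scale score from ten 1-5 Likert responses."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    # Odd-numbered items (index 0, 2, ...) contribute (response - 1);
    # even-numbered items contribute (5 - response).
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                     for i, r in enumerate(responses)]
    return sum(contributions) * 2.5  # scales the 0-40 sum to 0-100

# One worked example: a single participant's ten responses.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0

# Hypothetical per-participant SUS scores for the two UI conditions.
static_ui = [62.5, 70.0, 57.5, 65.0, 72.5, 60.0]
adaptive_ui = [80.0, 85.0, 77.5, 90.0, 72.5, 82.5]

# Nonparametric test, suitable for small samples of ordinal-derived scores.
stat, p = mannwhitneyu(adaptive_ui, static_ui, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")
```

The qualitative strand (interview and observation data) would sit alongside this, which is what "mixed methods" implies here.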

Clinical Partners

  • Imperial College London – National Heart and Lung Institute

Contact

Professor Massoud Zolgharni