AI in Healthcare Diagnosis: Guidance for a Responsible AI Governance Strategy

Artificial intelligence (AI) and machine learning (ML) are successfully being used in healthcare diagnosis and treatment management.

Key benefits

AI-assisted healthcare diagnosis empowers doctors, nurses, patients, caregivers, pharmacists and others to make more informed decisions and deliver effective care.

  • Increased efficiency of the diagnostic process
  • Reduced overall costs and operational burden
  • Enhanced patient experience
  • Better leveraging of data and sharing of insights
  • Better preventive care

Real-world Use Case Examples

Volpara provides clinically validated, AI-powered software for personalized screening and early detection of breast cancer; nearly 300,000 new breast cancer cases are predicted among US women this year.

Responsible AI Governance Framework

The following guidance is intended to help you develop a business case and a high-value use case for clinical decision support systems (CDSS) using artificial intelligence and machine learning.

This responsible AI governance framework guidance describes Esdha’s current research on the topic and should be viewed only as recommendations, unless specific regulatory or statutory requirements are cited.

Challenges & Opportunities


Data: AI relies on centralised clinical data and real-time data sources, which are often inadequate or unavailable in hospital settings.

Operational impact: poor data quality can affect the quality of the decision support provided.

Transportability and interoperability: given the diversity of clinical data sources, many systems exist as stand-alone solutions, imposing greater challenges on implementation. Cloud infrastructure helps to reduce interoperability issues.
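
As one illustration (a minimal sketch rather than prescribed practice), the snippet below shows how a cloud-hosted, FHIR-compliant API could be queried to pull observations out of otherwise stand-alone systems into a common format. The endpoint URL and patient identifier are hypothetical placeholders.

```python
# Minimal sketch: retrieving clinical observations from a FHIR-compliant API.
# The base URL and patient identifier are illustrative placeholders, not a real service.
import requests

FHIR_BASE = "https://example-fhir-server.org/fhir"  # hypothetical endpoint
PATIENT_ID = "12345"                                # hypothetical patient identifier

def fetch_observations(patient_id: str) -> list[dict]:
    """Return Observation resources for one patient as plain dictionaries."""
    response = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "_count": 50},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    response.raise_for_status()
    bundle = response.json()
    # A FHIR search returns a Bundle; each entry wraps one Observation resource.
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for obs in fetch_observations(PATIENT_ID):
        code = obs.get("code", {}).get("text", "unknown")
        value = obs.get("valueQuantity", {}).get("value")
        print(code, value)
```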

System monitoring & maintenance: Healthcare institutions have reported difficulty in monitoring and maintaining the knowledge base, algorithms, rules and data. 
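
One way to make such monitoring concrete is to track drift in the data a deployed model receives. The sketch below computes a Population Stability Index (PSI) between training-time and live feature values; the feature, sample sizes and the ~0.2 review threshold are illustrative assumptions rather than part of the guidance above.

```python
# Minimal sketch: monitoring input-data drift for a deployed model with the
# Population Stability Index (PSI). Bin count and thresholds are illustrative.
import numpy as np

def population_stability_index(reference: np.ndarray, current: np.ndarray, bins: int = 10) -> float:
    """Compare two samples of one numeric feature; a larger PSI means more drift."""
    # Bin edges come from the reference (training-time) distribution.
    edges = np.quantile(reference, np.linspace(0, 1, bins + 1))
    # Clip both samples into the reference range so every value falls in a bin.
    ref_counts = np.histogram(np.clip(reference, edges[0], edges[-1]), bins=edges)[0]
    cur_counts = np.histogram(np.clip(current, edges[0], edges[-1]), bins=edges)[0]
    ref_frac = np.clip(ref_counts / len(reference), 1e-6, None)
    cur_frac = np.clip(cur_counts / len(current), 1e-6, None)
    return float(np.sum((cur_frac - ref_frac) * np.log(cur_frac / ref_frac)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    training_values = rng.normal(loc=120, scale=15, size=5000)  # e.g. systolic BP at training time
    live_values = rng.normal(loc=128, scale=18, size=1000)      # shifted live data
    psi = population_stability_index(training_values, live_values)
    print(f"PSI = {psi:.3f}")  # a common rule of thumb flags values above ~0.2 for review
```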

Knowledge base: building the overall knowledge base on a clear evidence base requires specialist input from a range of care professionals.

Interdisciplinary team: an interdisciplinary team of computer scientists, patients, nurses, caregivers and clinicians is needed to align goals, requirements and clinical trial outcomes.

Accountability: it remains unclear who is accountable, or morally and legally answerable, for adverse outcomes. Frameworks for medical malpractice liability involving AI CDSS are needed.

Cost: due to the lack of standardised metrics, cost-benefit assessment is difficult, as cost-effectiveness depends on a range of socio-economic factors, including environmental, political and technological ones.

Potential risks & mitigation

Trustworthiness: different stakeholders have distinct expectations, which calls for adequate risk-benefit analysis when building rules and outcome measures.

Wrong or misleading recommendations: these can result in loss of trust or serious consequences for patients.

Privacy & quality: adherence to data protection and privacy requirements, such as the General Data Protection Regulation (GDPR), will be essential. A standardised approach to data collection can help to address this risk.
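
To illustrate what a standardised, privacy-aware collection step might look like, the sketch below validates incoming records against a minimal required-field schema and replaces the direct identifier with a keyed hash. The field names and salt handling are hypothetical, and pseudonymisation alone does not make data anonymous under the GDPR.

```python
# Minimal sketch: standardised intake of records with basic validation and
# pseudonymisation of the direct identifier. Field names are illustrative only.
import hashlib
import hmac

REQUIRED_FIELDS = {"patient_id", "age", "sex", "observation_code", "observation_value"}
SECRET_SALT = b"replace-with-a-managed-secret"  # illustrative; keep in a secrets manager

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash so records remain linkable."""
    return hmac.new(SECRET_SALT, identifier.encode(), hashlib.sha256).hexdigest()

def validate_and_prepare(record: dict) -> dict:
    """Reject incomplete or implausible records and pseudonymise identifiers."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record rejected, missing fields: {sorted(missing)}")
    if not 0 <= record["age"] <= 120:
        raise ValueError("record rejected, implausible age")
    prepared = dict(record)
    prepared["patient_id"] = pseudonymise(str(record["patient_id"]))
    return prepared

if __name__ == "__main__":
    raw = {"patient_id": "NHS-123-456", "age": 54, "sex": "F",
           "observation_code": "8480-6", "observation_value": 128}
    print(validate_and_prepare(raw))
```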

Bias, overfitting and validity: build rigorous criteria to evaluate for biases (such as statistical misrepresentation of the general population), overfitting, and validity.
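
As a hedged illustration of what such criteria could include, the sketch below uses scikit-learn on synthetic data to report two simple signals: the train/test accuracy gap (a crude overfitting check) and recall broken down by a subgroup attribute (a crude bias check). The synthetic dataset and the subgroup column are assumptions for demonstration; a real evaluation would use clinically validated cohorts and agreed fairness metrics.

```python
# Minimal sketch: checking a model's train/test gap (overfitting) and
# per-subgroup recall (a simple bias signal) on synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
group = rng.integers(0, 2, size=len(y))  # illustrative sensitive attribute (0/1)

X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, group, test_size=0.3, random_state=0
)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Overfitting check: a large train/test gap suggests the model will not generalise.
gap = accuracy_score(y_tr, model.predict(X_tr)) - accuracy_score(y_te, model.predict(X_te))
print(f"train/test accuracy gap: {gap:.3f}")

# Bias check: compare recall (sensitivity) across subgroups on held-out data.
predictions = model.predict(X_te)
for g in (0, 1):
    mask = g_te == g
    print(f"subgroup {g}: recall = {recall_score(y_te[mask], predictions[mask]):.3f}")
```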
