This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Unlocking the potential of qualitative research for the implementation of artificial intelligence-enabled healthcare
6
Citations
5
Authors
2023
Year
Abstract
<p> Artificial intelligence (AI)-enabled clinical decision support tools (CDSTs) are complicated technologies, which form the basis of complex AI-enabled healthcare interventions. Research on AI-enabled CDSTs has proliferated, with 57,844 model development studies and 5,073 comparative or real-world evaluation studies readily identifiable on PubMed at the time of writing (<a href="https://jmai.amegroups.org/article/view/7963/html#B1" target="_blank">1</a>). Despite this proliferation of evidence, a notable translational gap persists, with little real-world implementation of AI-enabled healthcare interventions (<a href="https://jmai.amegroups.org/article/view/7963/html#B2" target="_blank">2</a>). While research communities have acknowledged the value and importance of studying AI implementation in real-world clinical settings, there is limited evidence on how to translate the potential of AI into everyday healthcare practices. This persistent translational failure is multifactorial, but there is a clear opportunity for impact if the research community can deliver the evidence that healthcare systems’ decision makers need to fully evaluate complex interventions such as those involving AI-enabled CDSTs (<a href="https://jmai.amegroups.org/article/view/7963/html#B2" target="_blank">2</a>). This need for a holistic evidence base exists because AI-enabled CDSTs cannot be considered inert and isolated technologies, but rather components of a complex system which shape, and are shaped by, the adopters and organisations that enable their impact. The complexity surrounding the clinical implementation of AI tools and applications therefore requires a better understanding of the interplay between agency, social processes, and contextual conditions shaping implementation. 
Qualitative research provides a valuable approach to study AI implementation because it allows research communities to explore the interplay between social processes and contextual factors shaping the implementation of change (<a href="https://jmai.amegroups.org/article/view/7963/html#B3" target="_blank">3</a>). Qualitative research can also surface how these factors may be anticipated or modified to support judicious and successful implementation efforts across varied sociotechnical contexts. In so doing, it helps to answer complex questions such as how and why efforts to implement best practices may succeed or fail, and how patients and providers experience and make decisions in care (<a href="https://jmai.amegroups.org/article/view/7963/html#B4" target="_blank">4</a>). </p>
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,422 cit.
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,300 cit.
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,734 cit.
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 cit.
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,519 cit.
Authors
Institutions
- Newcastle upon Tyne Hospitals NHS Foundation Trust (GB)
- Newcastle University (GB)
- University College London (GB)
- Moorfields Eye Hospital (GB)
- Moorfields Eye Hospital NHS Foundation Trust (GB)
- Duke Institute for Health Innovation (US)
- University Hospitals Birmingham NHS Foundation Trust (GB)
- National Institute for Health Research (GB)
- University of Birmingham (GB)
- NIHR Birmingham Biomedical Research Centre (GB)
- NIHR Birmingham Liver Biomedical Research Unit (GB)
- University of Leicester (GB)