This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
AI-driven mental health decision support linked to clinician resilience and preparedness
0
Citations
5
Authors
2026
Year
Abstract
Objectives: Mental health services are facing unprecedented demand, placing significant pressure on clinicians to conduct timely and effective patient assessments. Rising staff turnover and burnout threaten service quality across many countries. This study examined whether providing clinical information, collected via an artificial intelligence (AI)-enabled decision support tool for mental health assessments in the UK's National Health Service (NHS), was associated with differences in clinician wellbeing and patient assessment performance.
Method: In this observational study, we surveyed mental health clinicians (N = 131) from nine NHS Mental Health Talking Therapies services on how the information provided by an AI-based decision support tool related to their experience of conducting clinical assessments. Clinicians reported on assessments where information from the AI tool was available, as well as on those where it was not (e.g., general practitioner referrals or telephone intakes). Outcomes included clinician wellbeing, task performance, and cognitive load during assessments, with additional analyses assessing the influence of moderating factors such as clinician experience, workload, and exposure to the tool.
Results: Relative to traditional methods, assessments supported by information provided by the AI tool were associated with significantly higher clinician wellbeing and task performance, and significantly lower cognitive load, irrespective of the clinician's experience. These associations were magnified by workload.
Conclusion: These findings provide preliminary evidence that AI-powered pre-assessment tools may be associated with differences in clinician experience, including higher wellbeing, higher task performance, and lower cognitive burden. By targeting systemic drivers of burnout, such tools may represent a potentially scalable approach to supporting workforce sustainability and service quality in mental health care.
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,534 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,423 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,917 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,582 citations