This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Medical Students' Perceptions of the Use of Generative Artificial Intelligence as an Assessment Strategy in Physiology
Citations: 0
Authors: 2
Year: 2026
Abstract
Generative artificial intelligence (AI) is increasingly embedded in higher education; however, its pedagogical role in assessment within medical education remains insufficiently characterized. This study analyzed medical students' perceptions of using generative AI as a formative assessment strategy in a physiology course. First-year medical students enrolled in <i>Physiology and Biophysics II</i> at a Brazilian medical school participated during the 2023 and 2024 academic years (n = 156). In 2023, the activity was completed individually, whereas in 2024 it was conducted in small groups. Students used ChatGPT to generate concise, topic-focused texts based on self-constructed prompts and subsequently evaluated and, when appropriate, revised the AI-generated content. Perceptions were assessed using a structured questionnaire followed by guided feedback discussions. Students consistently described the activity as challenging, engaging, and educationally valuable. Prompt construction emerged as the most demanding component, underscoring the importance of aligning prompts with learning objectives. Participants also emphasized the need for critical appraisal of AI-generated outputs, identifying conceptual gaps and inaccuracies that required correction. Group-based activities were perceived as particularly beneficial for discussion and collaborative learning. Overall, the findings highlight the importance of explicit instructional guidance when integrating generative AI into assessment, emphasizing prompt design, critical reasoning, ethical use, and learner autonomy. When thoughtfully implemented, generative AI may serve as a catalyst for meaningful learning in physiology education rather than a shortcut to answers.
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,418 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,288 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,726 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,516 citations