This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Translating multimodal foundation models into oncology: Toward a future where AI directs diagnosis and therapy
0
Citations
2
Authors
2025
Year
Abstract
Recent developments in multimodal artificial intelligence (AI) have begun to transform how clinicians approach cancer prognosis and treatment selection. In a recent study, Xiang et al.1 present MUSK (Multimodal Unified Self-supervised learning for Oncology), a foundation model that integrates more than 50 million whole-slide pathology images and over 1 billion oncology-related clinical text tokens. MUSK uses a unified transformer architecture to simultaneously capture morphological and semantic features, enabling the integrated image-text interpretation essential for oncology (Fig. 1A). The model was pretrained in two stages: the first stage employed masked modeling using unpaired data from each modality, while the second stage used approximately one million paired image-text samples with contrastive learning to align histologic and linguistic representations. This approach enabled robust cross-modal understanding, supporting downstream diagnostic and prognostic applications.

In comparative evaluations across several cancer types, including non-small cell lung cancer, colorectal carcinoma, gastric adenocarcinoma, and melanoma, MUSK outperformed traditional staging systems and biomarker-based models. In non-small cell lung cancer, the model demonstrated refined stratification of survival risk within the same pathological stage, capturing histologic subtleties and semantic modifiers within pathology reports that may be overlooked by standard criteria. For immunotherapy response prediction, MUSK achieved an AUC of 0.77 versus 0.61 for PD-L1-based classifiers. In patients with melanoma, recurrence risk prediction reached an accuracy of 83%, with the model prioritizing regions of mitotic activity, tumor regression, and lymphovascular infiltration. Interpretability was central to the framework, with visualization tools highlighting key histologic regions and textual descriptors. These interpretive tools may aid pathologists and
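The second pretraining stage described above aligns histologic and linguistic representations via contrastive learning over paired image-text samples. As an illustration only (not MUSK's actual implementation; the function name, batch shapes, and temperature value are assumptions), a CLIP-style symmetric contrastive loss can be sketched in NumPy: paired image and text embeddings are pulled together while mismatched pairs in the batch are pushed apart.

```python
import numpy as np

def contrastive_alignment_loss(img_emb, txt_emb, temperature=0.07):
    """Symmetric (image-to-text and text-to-image) contrastive loss.

    img_emb, txt_emb: arrays of shape (batch, dim); row i of each is a pair.
    """
    # L2-normalize so the dot product is cosine similarity
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)
    logits = img @ txt.T / temperature  # (batch, batch) similarity matrix

    n = len(img)
    diag = np.arange(n)  # correct match for row i is column i

    def cross_entropy(l):
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        log_probs = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -log_probs[diag, diag].mean()

    # Average the image->text and text->image directions
    return 0.5 * (cross_entropy(logits) + cross_entropy(logits.T))
```

As a sanity check, perfectly paired embeddings should yield a lower loss than the same batch with its pairing shuffled, which is the signal that drives the alignment during pretraining.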
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,422 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,300 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,734 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,519 citations