This is an overview page with metadata for this scholarly article. The full article is available from the publisher.
Small Language Models: Architecture, Evolution, and the Future of Artificial Intelligence
Citations: 0
Authors: 5
Year: 2026
Abstract
Large language models (LLMs) have significantly advanced artificial intelligence, yet their high computational, energy, and privacy costs pose substantial challenges. In contrast, Small Language Models (SLMs), typically with fewer than 15 billion parameters, have emerged as efficient alternatives. This survey provides a comprehensive analysis of the SLM landscape, tracing their evolution and examining architectural innovations that enhance efficiency. A novel multi-axis taxonomy is introduced to classify SLMs by genesis, architecture, and optimization goals, offering a structured framework for this field. Performance benchmarks are reviewed exhaustively, demonstrating that while LLMs excel in broad knowledge tasks, state-of-the-art SLMs match or exceed larger models in domains such as mathematical reasoning and code generation. The analysis concludes that the future of AI lies in hybrid ecosystems, where specialized SLMs manage most tasks locally, escalating complex queries to cloud-based LLMs. This tiered approach promises scalability, privacy, and the democratization of AI.
Related Works
Federated Learning: Challenges, Methods, and Future Directions
2020 · 4,427 citations
Deep Learning: Methods and Applications
2014 · 3,314 citations
Mobile Edge Computing: A Survey on Architecture and Computation Offloading
2017 · 2,908 citations
Machine Learning: An Artificial Intelligence Approach
2013 · 2,639 citations
Machine learning and deep learning
2021 · 2,358 citations