This is an overview page with metadata for this scholarly work. The full article is available from the publisher.
ChatGPT for Internet of Things Security: Capabilities, Risks, and Research Priorities
Citations: 0
Authors: 3
Year: 2025
Abstract
As Internet of Things (IoT) adoption accelerates, security risks pose major threats given the scale of vulnerable connected devices now embedded across critical infrastructure and environments. Conversational Artificial Intelligence (AI) systems like ChatGPT introduce new opportunities to enhance IoT security through natural language automation of tedious processes such as policy generation, vulnerability management, and education at unprecedented scale. However, unrestrained application also poses significant risks of misuse, over-reliance, and unvetted guidance. This paper provides a comprehensive analysis of ChatGPT's emerging double-edged implications for IoT security. We highlight a range of promising use cases where ChatGPT could automate security workflows through natural conversation, as well as major risks of misuse for reconnaissance and exploit generation if applied irresponsibly. A set of key tradeoffs in effectively governing ChatGPT is discussed. The paper argues for responsible constraints and human-centered workflows to maximize upside while mitigating downside risks as ChatGPT capabilities evolve. Finally, recommendations for future research are provided to guide informed adoption.
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,402 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,270 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,702 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,507 citations