This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
NonbAInary: How Does AI Depict Gender?
Citations: 0 · Authors: 3 · Year: 2026
Abstract
Most fairness audits of gender bias in visual AI remain limited to binary analyses and dismiss nonbinary genders. In this paper, we present "NonbAInary", an interactive installation that proposes a nonbinary approach to unveil gender bias. The installation features the Gender Diamond, a data-driven visualisation of the gender spectrum, operationalised through coexisting femininity and masculinity scales. Through the display of a series of AI-generated gendered portraits and a critical interface, visitors are invited to reflect on gender bias as a tension between accuracy — how closely a portrait aligns with one’s internalised gender stereotypes — and discrimination — whether such portraits are perceived as ethically harmful. By combining critical data practices and queer human–AI interaction, the installation reframes AI auditing as a critical, reflexive, and collective process.
Similar works
The global landscape of AI ethics guidelines
2019 · 4,725 citations
The Limitations of Deep Learning in Adversarial Settings
2016 · 3,886 citations
Trust in Automation: Designing for Appropriate Reliance
2004 · 3,512 citations
Fairness through awareness
2012 · 3,302 citations
AI4People—An Ethical Framework for a Good AI Society: Opportunities, Risks, Principles, and Recommendations
2018 · 3,202 citations