AI-generated images reproduce sexist, racist, and class-based biases. Eva Gengler has published a study on this.
Her overview (on LinkedIn) of the results:
📍 **Gender Bias**:
► Men are overrepresented in contexts of power and success.
► Women are mainly depicted in beauty-related contexts, often sexualized.
► In the context of power, older men and younger women are most commonly portrayed.
📍 **Racial Bias**:
► People of color are underrepresented, particularly in positions of power and success.
► White people dominate in all categories.
► A slight increase in diversity is seen in beauty contexts.
📍 **Class-based Bias**:
► The majority of depicted individuals come from privileged socioeconomic backgrounds and appear wealthy (business attire, elegance).
► Working-class people and "average"-looking individuals (those without model looks) are rarely represented.
✴ Eva Gengler presented on *"Feminist AI for a Fairer World"* at the herCAREER Expo 2024: "We have the great opportunity today to develop and use the systems that are being deployed everywhere with the goal of shifting power."
✴ She was also one of the speakers at the discussion event *"Our Future of Work with Artificial Intelligence – How Women Can Benefit,"* presented by the Bavarian State Ministry for Family, Work, and Social Affairs. The discussion will soon be available as a 🎧 podcast (link in the comments)!
✴ Additionally, Eva served as a Table Captain at herCAREER@Night 😊
Thank you, dear Eva, for your active role in the herCAREER community on such exciting topics!
Published by herCAREER, posted on LinkedIn on 13.11.2024