How AI Image Generation Amplifies Racial Stereotypes

In this November 2023 article for the Washington Post, authors Nitasha Tiku, Kevin Schaul, and Szu Yu Chen tackle the complex world of AI image generation, showing how artificial intelligence image generators like Stable Diffusion and DALL-E portray the world through a “Western lens” and amplify problematic stereotypes related to gender, race, and culture. Despite efforts to reduce bias, these AI models still depict Asian women as oversexualized, Africans as primitive, Europeans as worldly, and leaders as men, reflecting the data they are trained on.

The article argues that as synthetic images proliferate online, they could reinforce outdated stereotypes and encode biased ideals around body type, gender, and race into the future of image-making.

Editor’s Note: The troubling biases and stereotypes perpetuated by AI image generators underscore the urgent need for a fundamental rethinking of how we approach the development and deployment of these powerful technologies. Rather than simply tweaking datasets or adding disclaimers, we must grapple with the more profound philosophical and ethical questions at the heart of artificial intelligence: whose values and worldviews are encoded into these systems, and how can we ensure that they serve to liberate and empower all people, regardless of their background or identity?

We first reported on AI bias as early as 2018 [see SOCIETY GENDER BIAS COULD WORSEN WITH AI; AI IS BIASED AGAINST THE POOR AND PEOPLE OF COLOR. HOW CAN AI EXPERTS ADDRESS THIS?; THE BIASES THAT CAN BE FOUND IN AI ALGORITHM]. Six years on, AI developers have still been unable to fix the problem, which brings us to the question: will they ever find a solution?

