Generative AI and Stereotypes


Creatives have always needed to understand the strengths and weaknesses of their tools and how those tools shape their craft. As more creatives explore generative AI, we must be mindful of the bias and averaging that can appear in our work due to the data these tools are trained on.


Rest of World published an excellent article about how generative AI tools propagate stereotypes. They analyzed 3,000 AI images to see how image generators visualize different countries and cultures. Key takeaways are summarized below; read the full article at restofworld.org.


Results from Midjourney show stereotypical bias

The need for data transparency.
AI companies need to be more transparent about the data they use to train their models.

Bias exists in generative AI algorithms.
Bias occurs in many algorithms and AI systems — from sexist and racist search results to facial recognition systems that perform worse on Black faces. Generative AI systems are no different. In an analysis of more than 5,000 AI images, Bloomberg found that images associated with higher-paying job titles featured people with lighter skin tones and that results for most professional roles were male-dominated.

Generative AI has an upside for marginalized groups but also carries risks.
Generative AI could help improve diversity in media by making creative tools more accessible to marginalized groups or those lacking the resources to produce messages at scale. But used unwisely, it risks silencing those same groups.

Associative aspects of generative AI can lead to overly average results with bias.
" These models are purely associative machines,” Pruthi said. He gave the example of a football: An AI system may learn to associate footballs with a green field and so produce images of footballs on grass. In many cases, this results in a more accurate or relevant picture. But you're out of luck if you don’t want an “average” image. “It’s kind of the reason why these systems are so good, but also their Achilles’ heel,” - Sasha Luccioni, researcher in ethical and sustainable AI at Hugging Face.


AI, Visualization
Danny Stillion