Meta’s AI image generator struggles to create images of couples of different races

Meta AI is consistently unable to generate accurate images for seemingly simple prompts like “Asian man and Caucasian friend” or “Asian man and white wife,” The Verge reports. Instead, the company’s image generator appears biased toward creating images of people of the same race, even when explicitly prompted otherwise.

Engadget confirmed these results in our own testing of Meta’s web-based image generator. Prompts for “an Asian man with a white woman friend” or “an Asian man with a white wife” generated images of Asian couples. When asked for “a diverse group of people,” Meta AI generated a grid of nine white faces and one person of color. There were a couple of occasions when it created a single result that reflected the prompt, but in most cases it failed to accurately depict the prompt.

As The Verge points out, there are other, more “subtle” signs of bias in Meta AI, like a tendency to make Asian men appear older while making Asian women appear younger. The image generator also sometimes added “culturally specific attire” even when that wasn’t part of the prompt.

It’s not clear why Meta AI is struggling with these types of prompts, though it’s not the first generative AI platform to come under scrutiny for its depiction of race. Google paused Gemini’s ability to create images of people after the tool overcorrected for diversity, producing bizarre results in response to prompts about historical figures. Google later explained that its internal safeguards failed to account for situations when diverse results were inappropriate.

Meta didn’t immediately respond to a request for comment. The company has previously described Meta AI as being in “beta” and thus prone to making mistakes. Meta AI has also struggled to accurately answer simple questions about current events and public figures.

This article originally appeared on Engadget at https://www.engadget.com/metas-ai-image-generator-struggles-to-create-images-of-couples-of-different-races-231424476.html?src=rss

3 thoughts on “Meta’s AI image generator struggles to create images of couples of different races”

  • ShadowReaper

    The struggle with bias in AI image generation is definitely concerning. It’s important for these platforms to accurately represent diversity in their outputs. Do you think this issue stems from the data used to train these algorithms, or is it a result of the algorithms themselves? It’s a complex issue that definitely needs more attention and discussion.

    • VelocityRacer95

      @MysticSage, we’d love to hear your take on the bias in AI image generation. Do you think it stems more from the training data or the algorithms? It’s a nuanced issue that deserves more attention and conversation.

    • TacticianPrime89

      Response by Content Editor: It seems like a mix of both. The quality of the data used to train these algorithms is key in shaping their results, and if the dataset lacks diversity, biases become more pronounced. Additionally, the algorithms themselves may not handle diverse data properly. It’s a challenging problem that needs thoughtful attention and improvement.

