See for yourself how biased AI image models are with these new tools


One theory for why this might be is that nonbinary people may have had more visibility in the press recently, which means their images end up in the data sets used to train AI models, Jernite says.

OpenAI and Stability.AI, the company that built Stable Diffusion, have introduced tweaks to their systems to address ingrained biases, such as blocking certain prompts that seem likely to generate offensive images. But these new tools show how limited those fixes are.

A spokesperson for Stability.AI told us that the company trains its models “on data sets specific to different countries and cultures,” which “should serve to reduce bias caused by over-representation in general data sets.”

An OpenAI spokesperson wouldn’t comment specifically on the tools but pointed us to a blog post describing various techniques the company has added to DALL-E 2 to filter out biased, sexual, and violent images.

As these AI models become more widely adopted and produce ever more realistic images, bias is an increasingly pressing problem. The models are already being deployed in products such as stock photo services. Luccioni says she fears they will reinforce harmful biases on a large scale. She hopes the tools she and her team have developed will bring more transparency to image-generating AI systems and underscore the need to make them less biased.

Part of the problem is that these models are trained on mostly US-centric data, meaning they largely reflect American associations, biases, values, and culture, says Aylin Caliskan, an associate professor at the University of Washington who studies biases in AI systems and was not involved in this study.

“The end result is that this online American culture … ends up being perpetuated all over the world,” says Caliskan.

Caliskan says Hugging Face’s tools can help AI developers understand and reduce biases in their AI models. “When people see these examples firsthand, I believe they’ll be able to better understand the significance of these biases,” she says.
