AI image generator Midjourney blocks pornographic content by banning words about the human reproductive system.


Midjourney founder David Holz says the company is banning these words as a stopgap measure to prevent people from creating shocking or gory content while “improving things through AI.” Holz says moderators look at how words are being used and what kinds of images are being generated, and adjust the bans periodically. The company has a community guidelines page that lists the types of content it blocks in this way, including sexual imagery, gore, and even the 🍑 emoji, which is often used as a sexual symbol.

AI models such as Midjourney, DALL-E 2, and Stable Diffusion are trained on billions of images scraped from the internet. According to a study by a team at the University of Washington, these models learn sexualized biases about women, which are then reflected in the images they create. The sheer size of the data sets makes it all but impossible to remove images that are sexual or violent in nature, or that could produce biased results. The more often something appears in a data set, the stronger the association the AI model learns, meaning the more likely it is to appear in the images the model generates.

Midjourney’s term restrictions are an attempt to address this problem. Some words related to the male reproductive system, such as “sperm” and “nipples,” are also banned, but the list of banned words appears to skew heavily toward terms associated with female anatomy.

The prompt bans were first discovered by Julia Rockwell, a clinical data analyst at Datafi Clinical, and her friend Madeleine Keenan, a cell biologist at the University of North Carolina at Chapel Hill. Rockwell was using Midjourney to try to create a fun placenta-themed image for Keenan, who studies them, and found that “placenta” was banned as a prompt. She then began experimenting with other words related to the human reproductive system and found the same.

However, the pair found ways to work around these restrictions and create sexualized or gory images by using alternative spellings of banned words, or euphemisms for sexual or violent content.

In findings shared with MIT Technology Review, they showed that the prompt “gynaecological exam,” using the British spelling, generated some deeply disturbing images: one of two naked women in a doctor’s office, and another of a bald man with three hands on their stomach.

Image created by Midjourney using the prompt “gynecology exam.” Credit: Julia Rockwell

Midjourney’s crude ban on words related to reproductive biology shows how difficult it is to moderate content from generative AI systems. It also shows how the tendency of AI systems to sexualize women extends even to their internal organs, Rockwell says.


