Meta board probing Facebook, Instagram’s deepfake porn response

Meta’s oversight board is reportedly investigating Facebook and Instagram’s handling of two instances where artificial intelligence-generated pornographic images, including one of an “American public figure,” circulated on its popular social media platforms.

Meta’s board only provided descriptions of the deepfake images in question but did not name the famous women depicted in them in order to “prevent further harm,” a board spokesperson said.

Deepfakes of celebrities have recently become rampant, with Taylor Swift and US Rep. Alexandria Ocasio-Cortez among the victims of AI-generated deepfake porn in recent months.

Meta’s oversight board — which is funded by the social media giant but reportedly operates independently — said that it will use two examples of so-called “deepfakes” to assess the overall effectiveness of its policies and enforcement practices. REUTERS

According to Meta’s oversight board’s descriptions of its cases, one involves an AI-generated image of a nude woman resembling a public figure from India, posted by an account on Instagram that only shares AI-generated images of Indian women.

The other image, the board said, appeared in a Facebook group for sharing AI creations and featured an AI-generated depiction of a nude woman resembling “an American public figure” with a man groping her breast.

Meta removed the image depicting the American woman for violating its bullying and harassment policy, which bars “derogatory sexualized photoshops or drawings,” but initially left up the one featuring the Indian woman and only reversed course after the board selected it for review.

The board, which is funded by Meta but operates independently from the social media giant, will use the two deepfake examples to assess the overall effectiveness of Meta’s policies and enforcement practices around pornographic fakes created using artificial intelligence, it said in a blog post.

Representatives for Meta did not immediately respond to The Post’s request for comment.

Just last week, Ocasio-Cortez, 34, opened up about her own horrifying experience of discovering a fake image of herself performing a sex act, which she came across in February while scrolling through X and discussing legislation with her aides in a car.

“There’s a shock to seeing images of yourself that someone could think are real,” the Queens Democrat told Rolling Stone. “As a survivor of physical sexual assault, it adds a level of dysregulation. It resurfaces trauma, while I’m trying to … in the middle of a f—king meeting.”

The mental picture of her deepfake version placing her mouth over another’s genitals stayed with Ocasio-Cortez for the rest of the day, she said.

Rep. Alexandria Ocasio-Cortez said she was scrolling through X when she came across an AI-generated image of herself (above) performing a sex act in February.

Earlier this year, Swift was also the subject of a series of X-rated photos showing the pop sensation in various sexualized positions at a Kansas City Chiefs game — a nod to her highly publicized romance with the team’s tight end Travis Kelce — that were created using AI.

As quickly as the images began trending on X back in January, Swifties came together and tried to bury the images by sharing an influx of positive posts about the 34-year-old songstress.

It only took hours for users to track the crude images back to an account under the handle @FloridaPigMan, which no longer bears any results on X.

The account reportedly pulled the images from Celeb Jihad, a site that hosts a collection of deepfakes using celebrities’ likenesses.

After Taylor Swift also became a victim of AI porn, loyal Swifties wondered how the images were not considered sexual assault — and called for more regulation around the sharing of fake nude images. dpa/picture alliance via Getty Images

The incident triggered a renewed push for legislators to implement more stringent policies surrounding AI’s use and the prevalence of deepfake porn.

Nonconsensual deepfake pornography has already been made illegal in Texas, Minnesota, New York, Hawaii and Georgia, though those laws haven’t stopped the circulation of AI-generated nude images at high schools in New Jersey and Florida, where explicit deepfake images of female students were shared by male classmates.

In January, Reps. Joseph Morelle (D-NY) and Tom Kean (R-NJ) reintroduced a bill that would make the nonconsensual sharing of digitally altered pornographic images a federal crime, punishable by jail time, a fine or both.

Reps. Joseph Morelle and Tom Kean are pushing to pass the “Preventing Deepfakes of Intimate Images Act,” which would make the nonconsensual sharing of digitally altered pornographic images a federal crime.
AFP via Getty Images

The “Preventing Deepfakes of Intimate Images Act” was referred to the House Committee on the Judiciary, but the committee has yet to decide whether to advance the bill.

Aside from making the sharing of digitally altered intimate images a criminal offense, Morelle and Kean’s proposed legislation would also allow victims to sue offenders in civil court.

The bill has yet to pass the House of Representatives; it would also need approval by the Senate and the president’s signature to become law.

With Post wires
