(NewsNation) — A nude photo controversy has a New Jersey community demanding answers. High school students say they were stunned to learn that inappropriate photos of them were circulating among their classmates — mostly because the photos were completely fake.
“This is cyberbullying and bullying at the next level,” AI advisor and business strategist Marva Bailer said.
A group of boys at Westfield High School in New Jersey has been accused of using artificial intelligence (AI) to generate pornographic pictures of female students at the school and then sharing them in group chats. Police are now investigating, but experts say this kind of problem is only becoming more prevalent.
“These boys think they’re having fun and making these creations and it’s not real photography, but it’s a real image and likeness of these girls and it’s putting them in a scenario and a situation that is causing them harm and probably mental distress,” Bailer said.
The Westfield public schools superintendent released a statement to local media:
“All school districts are grappling with the challenges and impact of artificial intelligence and other technology available to students at any time and anywhere. We continue to strengthen our efforts by educating our students and establishing clear guidelines to ensure that these new technologies are used responsibly in our schools and beyond.”
Bailer said she wasn’t surprised that people were taking advantage of AI technology to harm others when they think it’s a joke.
“It’s not a joke,” she said.
According to Sensity AI, which offers deepfake detection services, 90% of all deepfakes on the internet are pornography. Experts say AI-generated child pornography in particular is a growing problem, as the online tools used to create it are increasingly available for free with little more than a Google search.
“Now, in the digital age we live in, information is moving so quickly and the trust factor because you’re looking at it so quickly. You trust everything. Once it’s out there, it’s out there and so it is going to be very, very challenging for law enforcement to find bad actors,” Bailer said.
Earlier this week, President Joe Biden signed an executive order on artificial intelligence, saying, “To realize the promise of AI and avoid the risk, we need to govern this technology.”