
14-year-old AI cyber exploitation victim speaks out about mortifying experience on social media

14-year-old Texas cyber exploitation victim Elliston Berry, her mother Anna McAdams and Texas Sen. Ted Cruz relayed the horrors of A.I. nude deepfakes spreading across the web.

A sinister cyber exploitation scheme turned Texas 14-year-old Elliston Berry's life upside down last October after she discovered deepfake nude images of herself circulating across social media one morning.

To make matters worse, many of her classmates had already come across them.

"[I was] just expecting a normal day, and I got numerous calls and messages from my friends telling me that these nude images were going around," she recalled Wednesday on FOX Business' "Mornings with Maria."

"[They were going] all around school – Snapchat, Instagram, throughout all different social medias," she added.


Berry's mother, Anna McAdams, also expected a normal day until her daughter rushed into her room horrified and in tears.

"She came in there showing me the pictures, and I was mortified and, as a mom, stunned. I mean, I couldn't protect my daughter," she said Wednesday.

She told Maria Bartiromo the photos looked convincing. 

One photo, of Berry standing on the deck of a cruise ship, was altered to replace her clothing with a fake naked body. According to a Wall Street Journal report, other fake nudes of girls in her friend group were generated from original beach photos.

"I recognized where she was," McAdams said. "As a mom, I knew immediately where the first picture was, but… if I didn't know that, and I was just looking at the photo, I would probably think it was real."

Berry would later discover a male classmate had orchestrated the scheme by taking two photos from her private Instagram account and rendering her nude using A.I. software, the WSJ reported.


The same male student was also behind the incidents involving her friends. 

Berry told the outlet the ordeal left her feeling "shameful and fearful" at school, as she wondered whether the students she encountered had seen the images and whether they believed the photos were fake.

"I felt like I had to tell everyone that it wasn't real and that I was, I was just kind of fighting for what I thought was right," she told Bartiromo.

Exploitation schemes have emerged as a chief ethical challenge as A.I. grows at a rapid pace, working its way into the workforce, the creative realm and elsewhere.

That has left lawmakers under pressure to respond.

Texas Republican Sen. Ted Cruz and Minnesota Democratic Sen. Amy Klobuchar are among the bipartisan group of lawmakers sponsoring the "Take It Down Act," which would criminalize the publication of non-consensual intimate imagery, require social media platforms to remove the content and more.


"This is a sick and twisted pattern that is getting more and more common. There are thousands and thousands of people who are victims, just like Elliston," Cruz said while sitting down in-studio with Bartiromo, Berry and McAdams.

"Of the deep fakes that are online, up to 95% of them are non-consensual intimate images, and the technology is such that you can take…. [innocent images] and you can turn them into pictures, you can turn them into videos. With technology, you can't tell that they're fake. You think it's real. And this is a pattern that is being used to abuse [mostly women and teenage girls]."

Cruz said the bill aims to bring prison time to offenders – putting them behind bars for two years if the offense is committed against an adult and three years if the victim is a minor.

He also emphasized that such legislation would give social media platforms a legal obligation to step in and remove harmful imagery.


Data & News supplied by www.cloudquote.io
Stock quotes supplied by Barchart
Quotes delayed at least 20 minutes.
By accessing this page, you agree to the following
Privacy Policy and Terms and Conditions.