
A predator used her 12-year-old face to make porn. She helped pass a law to make that a crime

Child actor Kaylin Hayman fought back after she learned that a man had used AI to make child sex abuse materials from images on her Instagram page

Kaylin Hayman in Ventura, California, on 3 October. Photograph: Leafy Yun Ye/The Guardian

Last year, Kaylin Hayman walked into a Pittsburgh court to testify against a man she’d never met who had used her face to make pornographic pictures with artificial intelligence technology.

Kaylin, 16, is a child actor who starred in the Disney show Just Roll With It from 2019 to 2021. The perpetrator, a 57-year-old man named James Smelko, had targeted her because of her public profile. She is one of about 40 of his victims, all of them child actors. In one of the images submitted into evidence at the trial, Smelko had taken her face from a photo posted to Instagram when she was 12, working on set, and superimposed it onto someone else's naked body.

“I’ve had my fair share of crying uncontrollably because I don’t understand how some people are so evil,” she tells the Guardian in an interview. “I can never really wrap my head around that.”


Kaylin lives in Ventura, California, and Smelko was based in Pennsylvania when he committed these crimes against her. She was shocked to learn that her case could be brought to trial only because it was an interstate crime. Possessing depictions of child sexual abuse is criminalized under US federal law, but under California state law at the time, digitally fabricated images like Smelko's were not considered illegal.

Kaylin turned her horror into action. This year, she became a staunch public advocate in support of a new California bill, AB 1831, that expands the scope of existing laws against child sexual abuse material (CSAM) to include images and videos that are digitally altered or generated via AI. In June, she testified in support of the bill at the state capitol in Sacramento.

“I talked about how I felt violated and that I was absolutely appalled that this wasn’t already a crime in California,” says Kaylin. “California is such a huge part of the acting industry, and there are so many kids who were not protected from this crime.”

At the end of September, California’s governor, Gavin Newsom, signed the measure into law. Child predators creating such material can face imprisonment and fines of up to $100,000 in the state.

While the new law focuses on AI in the hands of child predators, other factors in Kaylin’s life put her at risk of encountering Smelko or those like him, according to her and her parents, Mark and Shalene Hayman.

The Hayman family in Ventura, California, on 3 October. Photograph: Leafy Yun Ye/The Guardian

Kaylin was 10 years old when she first got her Instagram account. The social network requires users to be at least 13 to sign up, with the exception of accounts managed by parents. Smelko downloaded photos from her profile to create sexual images that combined her face with the naked bodies of other girls and women.

“Disney set up her Instagram account specifically to promote the show and themselves,” says Mark. “But when these companies are employing these kids and making them post on there and not providing support – that’s where the bigger issue lies.”

That support should include counseling and training on dealing with harassment and blocking accounts, he says. Kaylin likewise lays the blame at Disney's feet.

“Disney’s PR team had me and all of the kids at Disney sign up for an app. They used to send us clips to post on Instagram every week that an episode would come out,” says Kaylin. “It started with my job and them planting that seed. I would like them to take some responsibility, but that has yet to happen.”

In recent years, men have harassed Kaylin via her Instagram and TikTok accounts by sending her nude photos. She has reported the unwanted messages to both social media companies, but she says no action has been taken.

“She’s certainly had her fair share of creepy stalkers who continue to taunt her,” says Shalene.

The California state capitol in Sacramento. Photograph: Backyard Productions/Alamy

Mark believes that SAG-AFTRA, the Hollywood actors' union, also needs to be more proactive in educating its members about the risks of predators using AI and social media to victimize public figures. Both parents regularly check Kaylin's accounts, which she still uses and has access to.

“We do read a lot of comments and think, ‘What is wrong with people?’, but I don’t know if you can get away from it. It’s difficult to be in this industry and not be on social media,” says Shalene. “I would like to see the social media companies do some responsible censoring and protections.”

Over the past few years, Instagram has announced several initiatives to increase protections for its users under 16, including parental controls and measures to determine who can message them. In September, the company announced it would make all accounts for users under 18 private by default, a move praised by child safety advocates. The same restrictions apply to minors’ verified accounts, according to Meta’s guidelines.

“There are so many inappropriate images circulated on Instagram. I just don’t understand why they are able to be sent to kids,” says Kaylin, who turns 17 this month. “Instagram should be like, ‘No, that’s not allowed,’ and take it down. But it doesn’t happen, and I don’t understand.”

Meta said in a statement: “We have detailed and robust policies against child nudity and exploitation, including real images and those created using GenAI.”

Kaylin Hayman testifies in court in support of the AB 1831 bill at a public safety committee hearing in Sacramento, California, in April. Photograph: Courtesy Ventura County District Attorney’s Office

“SAG-AFTRA has been educating, bargaining, and legislating about the dangers of deepfake technology since at least 2018,” said Jeffrey Bennett, the general counsel for SAG-AFTRA. Bennett pointed to the guild's publication of a magazine on deepfakes and its participation in panels and published articles on the topic.

Disney did not offer comment.

The circulation of CSAM is on the rise online. Predators have long used photo-editing software, but recent advances in AI models make it easy to mass-produce more realistic abuse images of children. In 2023, the National Center for Missing & Exploited Children (NCMEC), a US-based clearinghouse for the global reporting of CSAM, received 36.2m reports of child abuse online, up 12% from the previous year. Most of them came from Meta.

While most of these reports related to real photos and videos of sexually abused children, the NCMEC also received 4,700 reports of images or videos of the sexual exploitation of children made by generative AI. The organization has criticized AI companies for not actively working to prevent or detect the production of CSAM.

Kaylin says that discovering her face had been used to create CSAM signaled the end of her childhood innocence. She is now more nervous about her safety and that of other children and teens she knows.

“If I see a man or somebody who looks at me a little bit weird or oddly, I’m always on edge,” she says. “I’m always thinking about the worst that can happen in certain situations. I think it’s something young women have had to get used to. It’s unfortunate that I had to have that wake-up call at 16. I guess it’s just part of life,” she adds.

‘I’m always thinking about the worst that can happen in certain situations,’ Kaylin Hayman says. Photograph: Leafy Yun Ye/The Guardian

Giving her testimony at Smelko's trial a year ago was a way of taking back some control over the situation, she says. In court, while she kept her focus on answering the prosecutor's questions and faced the jury, she shot one quick glance at the stranger standing trial for sexually exploiting her.

“When I did get a glimpse of him, it looked like he had a really sad life and he probably stayed inside for a lot of it because he was not a first-time felon,” she says. After she testified, Smelko was convicted of two counts of possessing child pornography.

Kaylin is determined to continue acting and wants to appear in movies someday. But right now she is focused on finishing her senior year of high school and her advocacy work against online child exploitation. The ordeal has also sparked a new ambition for her. She wants to go to law school so she can one day become an attorney specializing in children’s rights.

“I’m very fortunate that my case wasn’t worse. I know a lot of people have it worse than me,” she says. “I’m trying to add a little bit of good to something so bad.”

In the US, call or text the Childhelp abuse hotline on 800-422-4453 or visit their website for more resources and to report child abuse or DM for help. For adult survivors of child abuse, help is available at ascasupport.org. In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.