My picture was used in child abuse images. AI is putting others through my nightmare

When I was a little girl, there was nothing scarier than a stranger.

In the late 1980s and early 1990s, kids were told – by our parents, by TV specials, by teachers – that there were strangers out there who wanted to hurt us. “Stranger Danger” was everywhere. It was a well-meaning lesson, but the risk was overblown: most child abuse and exploitation is perpetrated by people the children know. It’s much rarer for children to be abused or exploited by strangers.

Rarer, but not impossible. I know, because I was sexually exploited by strangers.

From ages five to 13, I was a child actor. And while we’ve lately heard many horror stories about the abusive things that happened to child actors behind the scenes, I always felt safe while filming. Film sets were highly regulated spaces where people wanted to get work done. I had supportive parents, and was surrounded by directors, actors, and studio teachers who understood and cared for children.

The only way show business did endanger me was by putting me in the public eye. Any cruelty and exploitation I experienced as a child actor came at the hands of the public.

“Hollywood throws you into the pool,” I always tell people, “but it’s the public that holds your head underwater.”

Before I was even in high school, my image had been used for child sexual abuse material (CSAM). I’d been featured on fetish websites and Photoshopped into pornography. Grown men sent me creepy letters. I wasn’t a beautiful girl – my awkward age lasted from roughly 10 to 25 – and I acted almost exclusively in family-friendly movies. But I was a public figure, so I was accessible. That’s what child sexual predators look for: access. And nothing made me more accessible than the internet.

It didn’t matter that those images “weren’t me”, or that the fetish sites were “technically” legal. It was a painful, violating experience; a living nightmare I hoped no other child would have to go through. Once I was an adult, I worried about the kids who had come after me. Were similar things happening to the Disney stars, the Stranger Things cast, the preteens making TikTok dances and smiling on family vloggers’ YouTube channels? I wasn’t sure I wanted to know the answer.

When generative AI started to pick up a few years ago, I feared the worst. I’d heard stories of “deepfakes”, and knew the technology was getting exponentially more realistic.

Then it happened – or at least, the world noticed that it had happened. Generative AI has already been used many times to create sexualized images of adult women without their consent. It happened to friends of mine. But recently, it was reported that X’s AI tool Grok had been used, quite openly, to generate undressed images of an underage actor. Weeks earlier, a girl was expelled from school for hitting a classmate who had allegedly made deepfake porn of her.