
Using artificial intelligence to make inappropriate images raises questions about the dangers of the program

Cyber tip led to the investigation of a registered sex offender accused of using AI to create child abuse images

COLUMBIA COUNTY, Fla. – The arrest of a 26-year-old registered sexual predator whom Columbia County deputies accuse of possessing child pornography is raising questions about explicit images created by artificial intelligence.

Randy Cook, of Lake City, is a registered sexual predator who was convicted of lewd and lascivious molestation. During the investigation, Cook told investigators that he was a part of a group called “Make Loli Legal.”

Cyber security expert Chris Hamer said federal and state laws are in place to protect children from being sexually exploited on the internet.

“If there is no actual victim that this process has harmed, then the law cannot prosecute,” Hamer said. “The gray area is only in the interpretation of the image. Is the image itself actually harmful? Is the possession of it causing harm to another individual? Since no one posed or was abducted, molested, or abused to generate the image, there is no injured party.”

RELATED: AI-generated child pornography found in possession of Lake City man sparks investigation that leads to arrest: CCSO

According to an arrest report, Cook told a detective that the group was a fantasy group with anime images that appeared to be children. The images Cook reportedly showed the detective were sexually explicit images of children that were AI-generated.

Hamer said “Loli” is a graphic style that portrays adult women as underage girls.

“They dress younger. They behave younger. Squeaky voice. Babyish behavior. So, they blur the line between what is a minor and what is not,” Hamer said.

The report said Cook told investigators that the images helped him cope with his sexual issues.

Hamer said what Cook told the detective is no different than saying he was self-medicating to treat an addiction.

“The danger is people that have this addiction or glitch in their functions don’t get appropriate treatment. And they continue to build fantasies around the images they have generated,” Hamer said.

Hamer said it’s easy to generate images with AI, as the software can produce an image from a text prompt in under a minute.

Several AI programs will flag offensive words associated with child pornography. Still, some offenders have found a workaround, tricking the system by entering terms the filters do not recognize.


About the Author
Erik Avanier

Award-winning broadcast and multimedia journalist with 20 years’ experience.
