Justice Department officials say they’re aggressively going after offenders who exploit AI tools, while states are racing to ensure people generating “deepfakes” and other harmful imagery of kids can be prosecuted under their laws. With recent significant advances in AI, it can be difficult, if not impossible, for law enforcement officials to distinguish between images of real and fake children.

Lawmakers, meanwhile, are passing a flurry of legislation to ensure local prosecutors can bring charges under state laws for AI-generated “deepfakes” and other sexually explicit images of kids. Governors in more than a dozen states have signed laws this year cracking down on digitally created or altered child sexual abuse imagery, according to a review by The National Center for Missing & Exploited Children. Laws like these that encompass images produced without depictions of real minors might run counter to the Supreme Court’s Ashcroft v. Free Speech Coalition ruling, which struck down a federal ban on computer-generated child sexual abuse imagery. An earlier case, New York v. Ferber, effectively allowed the federal government and all 50 states to criminalize traditional child sexual abuse material.
Porn site investigated by Ofcom over online safety
There are many reasons why someone might sexually harm a child, and children are kept safer when we are informed about what increases risk in their relationships and environment. We are better prepared to speak up whenever someone is acting unsafely around a child, regardless of what we know about their mental health or attractions. BBC News has investigated concerns that under-18s are selling explicit videos on the site, despite it being illegal for individuals to post or share indecent images of children.
If you find what you believe to be sexual images of children on the internet, report it immediately to authorities by contacting the CyberTipline. If you or someone you know is concerned about their internet activity, seek the help of professionals who specialize in this area. Unlike physical abuse, which leaves visible scars, the digital nature of child sexual abuse material means victims are re-traumatised every time their content is viewed. Once inside, users can find vast criminal networks, including those peddling child sexual abuse material on a massive scale, Mistri adds. Up to 3,096 internet domains hosting child sexual abuse material were blocked in 2024 amid Globe’s #MakeItSafePH campaign.
The law enforcement operation was a “massive blow” against distributors of child pornography that would have a “lasting effect on the scene”, Mr Gailer said. “Our dedication to addressing online child abuse goes beyond blocking harmful sites. It involves a comprehensive approach that includes technological solutions, strong partnerships and proactive educational programs,” Globe’s chief privacy officer Irish Krystle Salandanan-Almeida said. Understanding more about why someone may view CSAM can help identify what can be done to address and stop this behavior – but it’s not enough.
Visual analysis of the images
By category among teen offenders, 40.2 percent of the violations involved secretly taking pictures and video or persuading victims to send nudes, followed closely by violations for posting such content online, at 39.6 percent. Researcher Jessica Taylor Piotrowski, a professor at the University of Amsterdam, said that measures such as age restrictions alone are no longer effective. This issue was also raised by researcher Verity McIntosh, an expert in virtual reality.
- “The website monetized the sexual abuse of children and was one of the first to offer sickening videos for sale using the cryptocurrency bitcoin,” the NCA said in a statement.
- These extra tags describe the sexual activity seen and enable our assessments to be compatible with multiple legal jurisdictions around the world.
- They plan to investigate buyers and sellers who used the website on suspicion of violating the Law Banning Child Prostitution and Child Pornography.
Laws about child pornography
Britain’s National Crime Agency (NCA) said the “Welcome to Video” site contained 250,000 videos that were downloaded a million times by users across the world. Police reported 3,035 cases of child pornography in 2022, up 66 from the previous year; of those, police arrested 2,053 offenders and referred them to prosecutors, up 64 from the previous year. The website was used “solely” to share pornographic images of children, chief investigator Kai-Arne Gailer told a press conference. Child pornography is now more commonly referred to as child sexual abuse material (CSAM) to more accurately reflect the crime being committed. He also called for greater online child safety, stressing how online behaviour could have long-term consequences.
This can often feel confusing for a young person, as it may seem as if this person truly cares about them. The live-streaming nature of the material was particularly sickening, the institute’s report noted, because of the real-time element. Our experts explore the changes we can all make to help improve outcomes for children. “Even though I was not physically violated,” said 17-year-old Kaylin Hayman, who starred on the Disney Channel show “Just Roll with It” and helped push the California bill after she became a victim of “deepfake” imagery. The NSPCC Library and Information Service helps professionals access the latest child protection research, policy and practice resources and can answer your safeguarding questions and enquiries.