Experts say more needs to be done to help protect children from online predators

(NEW YORK) — Protecting children and teens from online dangers is no easy feat. It’s an ever-changing landscape and in many ways, it’s easier than ever for predators to target children online.

In 2023, reports of online child exploitation made to the National Center for Missing & Exploited Children’s (NCMEC) CyberTipline rose by more than 12%, surpassing 36.2 million.

“Child predators used to be the person who was going around driving in a creepy van, going by schools offering candy to kids,” New Jersey State Police Lt. Paul Sciortino told ABC News. “The internet has become that van.”

Sciortino works on the Internet Crimes Against Children Task Force and investigates cyber tips that come from NCMEC. He told ABC News that the dangers that lurk online can often be overlooked by kids and parents.

Titania Jordan, the chief parent officer at Bark, a technology company that aims to help parents and educators keep children safe online through monitoring services offered via an app and a dedicated phone, said there are hundreds of thousands of predators online.

“The estimated half a million predators that are known to be online at any time know the internet is a vast playground for children that are online [an] upwards [of] eight hours a day, most of them unmonitored, unfiltered, unrestricted,” Jordan told ABC News in an interview.

Bark says it uses artificial intelligence to analyze the content and context of children’s digital activity, whether that’s texts, social media, email or browsing.

“You’re giving [children] access to the entire world, and you’re giving the entire world access to them,” continued Jordan, who spoke about how Bark’s app allows parents to receive alerts if their children view something potentially harmful.

“And currently, the laws in place — at least in the United States — are not holding social media or big tech accountable. They are not liable for what’s happening to children on their platform,” she added.

In January, the CEOs of five major social media companies — TikTok, Meta, X, Snapchat and Discord — testified before the U.S. Senate Judiciary Committee about protecting kids from sexual exploitation online.

“This disturbing growth in child sexual exploitation is driven by one thing: changes in technology. … Smartphones are in the pockets of seemingly every man, woman and teenager on the planet,” U.S. Senate Majority Whip Dick Durbin said in his opening statement during the hearing.

“These apps have changed the way we live, work and play. But, as investigations have detailed, social media and messaging apps have also given predators powerful new tools to sexually exploit children,” he said.

During the hearing, parents sat in the room holding photos of their children who had fallen victim to predators and scammers on those very platforms; many of the children are no longer alive.

“I’m sorry for everything you have all been through,” Meta CEO Mark Zuckerberg told families in the room. “No one should go through the things that your families have suffered, and this is why we invest so much, and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer.”

Platforms like Meta and Snapchat have introduced new tools that they say will help protect teens from unwanted contact and help parents monitor their teens’ online activity.

“Child exploitation is a horrific crime,” a Meta spokesperson said in a written statement to ABC News. “We’ve spent years supporting law enforcement in investigating and prosecuting the criminals behind it and have developed over 50 tools, features and resources to protect teens, including to help prevent unwanted contact.”

“Our goal is to make it as hard as possible for young people to be contacted by people they don’t know, which is why we have extra safeguards for teens to protect against unwanted contact, offer easy blocking tools, and send in-app warnings if someone they don’t know tries to contact them,” said a Snapchat spokesperson.

Experts like Jordan say that while these parental controls “sound great,” they don’t give parents or caregivers “enough of the critical insight that you need to keep your child safe.” She argues that too often, children are left responsible for making decisions about their own online safety.

“We limit driving age, alcohol and tobacco use, et cetera, but when it comes to the internet, social media, gaming, even screen time, we’re leaving it up to the children without giving parents full access and ability to help them become responsible digital natives,” Jordan said.

Copyright © 2024, ABC Audio. All rights reserved.