(NEW YORK) — Social media platform TikTok will set a default 60-minute daily screen time limit for users under the age of 18, the company announced on Wednesday.
When the one-hour limit expires, young users will be prompted to enter a password that allows them to continue using the app, the company said.
Young users can opt out of the daily 60-minute limit, but some heavy users will be asked to set a daily screen time limit of their own choosing, said TikTok, which is owned by the Chinese company ByteDance.
“We believe digital experiences should bring joy and play a positive role in how people express themselves, discover ideas, and connect,” Cormac Keenan, TikTok’s Head of Trust and Safety, said in a statement.
TikTok, which has more than 100 million monthly active users in the U.S., has faced growing scrutiny from state and federal officials over fears that American data could fall into the possession of the Chinese government.
The app has also encountered sharp criticism over the risks it may pose to the mental health of young people.
Republican Rep. Michael Gallagher, of Wisconsin, who chairs a House select committee on China, told NBC’s “Meet the Press” last month that he considers TikTok “digital fentanyl.”
“It’s highly addictive and destructive and we’re seeing troubling data about the corrosive impact of constant social media use, particularly on young men and women here in America,” he added.
Last year, a bipartisan group of state attorneys general launched a nationwide investigation into the mental health effects of TikTok for young users.
Scrutiny over the harmful effects of content on social media, especially for young people, intensified after leaks from whistleblower Frances Haugen in 2021 revealed that an internal Facebook study had shown damaging mental health effects of Instagram for teen girls.
The Centers for Disease Control and Prevention released a study last month that showed elevated rates of poor mental health in young people, especially young girls.
In 2021, nearly 60% of high school girls experienced persistent feelings of sadness or hopelessness over the previous year, and almost 25% made a suicide plan during that period, the study found.
In July, TikTok announced plans for a rating system aimed at protecting young users from inappropriate content.
Wisconsin Sen. Tammy Baldwin and Minnesota Sen. Amy Klobuchar, both Democrats, last February sent a letter to TikTok saying that its algorithm’s “nonstop stream of videos” increases the likelihood that viewers will encounter harmful content even without seeking it out.
The letter followed an investigation by The Wall Street Journal that found the platform surfaced tens of thousands of weight loss videos to a dozen automated accounts registered as 13-year-olds within a few weeks of their joining the app.
Other social media platforms have also faced criticism over their effect on young people’s mental health. In September 2021, Facebook suspended plans to offer a version of Instagram for kids.
The following month, officials from Snapchat, TikTok and YouTube told lawmakers they would work with them on proposals to help protect young users from harmful content on their platforms.
TikTok CEO Shou Zi Chew is scheduled to appear before the House Energy and Commerce Committee in March to testify on the company’s data security practices, the committee said last month.
Copyright © 2023, ABC Audio. All rights reserved.