The U.S. surgeon general warned Tuesday of the potential risks of social media to young people, and urged policy makers and technology companies to strengthen standards for adolescents.
Dr. Vivek Murthy’s office said a growing body of research shows the detrimental effects of social media on adolescents, though its public advisory said more research is needed to better understand those effects on children and teens.
The advisory noted the benefits of social media, including its use as an outlet for creativity and for finding community. But, the report said, “there are ample indicators that social media can also have a profound risk of harm to the mental health and well-being of children and adolescents.”
“We are in the middle of a national youth mental health crisis, and I am concerned that social media is an important driver of that crisis—one that we must urgently address,” Murthy said in a statement. The surgeon general said advisories like the one released Tuesday are reserved for significant public health issues that require the nation’s immediate attention. The advisory offers suggestions for policy makers but doesn’t directly change laws.
Children and teens are particularly vulnerable from ages 10 to 19 as their brains continue to develop, according to the report. The advisory cites a 2022 Pew Research Center survey that found 95% of teens use a social-media platform, and more than a third use at least one “almost constantly.”
The advisory pointed to several studies examining a range of adverse effects of social media on adolescents, including online harassment, increased exposure to content related to self-harm and racism, and negative impacts on sleep, body image and physical activity. One 2019 study cited in the report found that U.S. adolescents between the ages of 12 and 15 who spent more than three hours a day on social media faced twice the risk of experiencing symptoms of depression and anxiety.
The surgeon general’s office didn’t name any social-media companies in its warning.
TikTok and Meta Platforms, which owns Facebook and Instagram, declined to comment Tuesday. Twitter didn’t comment.
When asked about children and mental health at a March 2021 congressional hearing, Meta Chief Executive Mark Zuckerberg said connecting with others on apps can have positive mental-health benefits. The company has said it has developed tools to support teens and families as they navigate social-media use, as well as age-verification technology that helps teens have age-appropriate experiences.
TikTok, the popular short-form video app owned by ByteDance, added a 60-minute screen-time limit for users under the age of 18 in March.
Snapchat said in a statement Tuesday: “As a messaging service for real friends, we applaud the surgeon general’s principled approach to protecting teens from the ills of traditional social media platforms.”
The surgeon general’s report includes recommendations for addressing the issue, aimed at companies, families and lawmakers. It asks policy makers to improve safety standards, including by strengthening data privacy for adolescents. The advisory also urges technology companies to take responsibility for the effects of their platforms, share relevant data with researchers and the public, and respond to complaints more swiftly.
“Our children and adolescents don’t have the luxury of waiting years until we know the full extent of social media’s impact,” the office said.
Government officials, lawmakers and technology companies have for years grappled with how best to manage the issue. Lawmakers in the U.S. and Europe are weighing plans to tighten online age restrictions. Utah Gov. Spencer Cox, a Republican, recently signed a law that will require social-media companies to verify users are 18 years or older, and require those under age 18 to receive the consent of a parent or guardian to open an account.
The Wall Street Journal’s Facebook Files series in 2021 showed that internal research at the company found Instagram was harmful for a percentage of young users, primarily teenage girls with body-image concerns; the same research, reviewed by the Journal, showed the platform made body-image issues worse for a third of teenage girls. Facebook, which became Meta in 2021, scrapped plans to create an Instagram platform tailored to children after lawmakers and others raised concerns over the popular app’s impact on young people’s mental health.
A Meta spokesman said at the time that the investigation was premised on a misunderstanding of issues that also affect other social-media platforms.
Instagram has added more protections for teens in recent years, such as automatically setting new accounts to private for users under 16 years old.