Meta expands teen accounts to Facebook and Messenger, critics say more needs to be done

One year after launching teen accounts for Instagram, Meta is expanding the program to Facebook and Messenger. The company said the move is part of its ongoing effort to keep kids safer online.

With teen accounts, users under 18 are automatically enrolled with built-in protections.

Meta says 97% of teens under 16 are staying within those restrictions.

The company also highlights features such as sleep mode and supervision tools, which let parents set daily time limits and monitor activity.

“Teen accounts are really meant to respond to some of the top concerns that we’ve heard from parents,” Jennifer Hanley, Meta’s North American head of safety policy, told WTOP in September.

The accounts ensure teens under 16 need their parents’ permission to change the restrictions, according to Hanley. Among the offerings are tools that keep kids from engaging on the platforms for long periods.

“After 60 minutes, a teen in the teen account gets a notification encouraging them to leave the platform,” Hanley said.

But not everyone is convinced the tools are helping. A report from Cybersecurity for Democracy labeled 64% of the safety tools “red” because they fell short.

The report’s authors, who included a former Facebook employee, said the tools were rated that way because they were either “no longer available or ineffective.”

The report also warned that teens still encounter harmful “rabbit holes,” including imagery of self-harm.

Hanley said Meta disagrees with the report and pushed back on the findings.

“We’ve been overwhelmingly hearing great things from parents,” she said. “We know that teens are spending less time on our platforms, they’re seeing less sensitive content and they’re having less unwanted contact as a result of being in teen accounts.”

Meta said it remains open to feedback and continues to improve its safety tools.

“We’re always open to constructive feedback,” Hanley said.

PG-13 content guidelines introduced

After the September interview with WTOP, Meta announced an update to teen accounts.

The tech company said Instagram will now guide teen content using PG-13 movie ratings by default. That means content seen by teens will be similar to PG-13 movies and teens won’t be able to opt out without a parent’s permission, according to Meta.

Parents who want more control can choose a stricter setting, Meta said, and they’ll also have new ways to report content they think teens shouldn’t see.

In a blog post, Meta called this “the most significant update” since teen accounts launched, saying it was shaped by feedback from thousands of parents worldwide.

The company also said it will use age prediction technology to place teens into protections even if they lie about their age when signing up.

Meta acknowledged in the post that “no system is perfect,” but said it’s committed to improving and keeping age-inappropriate content away from teens.

Support for schools added

Hanley also said Meta is expanding its efforts to help schools.

Through its School Partnership Program, middle and high schools in the U.S. can sign up to get educational resources and tools to report harmful content more easily. Schools that enroll receive a verified badge and access to expedited content review.

Meta said educators are often in the best position to spot issues such as bullying, and the program is designed to help them flag and address those concerns more effectively.


© 2025 WTOP. All Rights Reserved.

Mike Murillo

Mike Murillo is a reporter and anchor at WTOP. Before joining WTOP in 2013, he worked in radio in Orlando, New York City and Philadelphia.
