Meta expands ‘Teen Accounts’ to Facebook, Messenger amid children’s online safety regulatory push

Meta Platforms is rolling out its “Teen Accounts” feature to Facebook and Messenger on Tuesday, as it faces sustained criticism about not doing enough to protect young users from online harms.

The enhanced privacy and parental controls, which were introduced on Instagram last year, will address concerns about how teens are spending their time on social media, the company said.


Meta’s expansion of safety features for teens comes as some lawmakers say they plan to press ahead with proposed legislation, such as the Kids Online Safety Act (KOSA), seeking to protect children from social media harms.

Meta, ByteDance’s TikTok and Google’s YouTube already face hundreds of lawsuits filed on behalf of children and school districts about the addictive nature of social media.

In 2023, 33 U.S. states, including California and New York, sued the company, accusing it of misleading the public about the dangers of its platforms.

Meta said teens under 16 will need parental permission before they can go live, and before they can disable a feature that automatically blurs images potentially containing nudity in direct messages.

“We will start including these updates in the next couple of months,” the company said.

Reuters
