Meta Adds New Teen Safety Features to Prevent Inappropriate Content and Strengthen Protections in Private Messages
Meta has introduced new safety features for teens, including stricter message controls and account removals, as part of ongoing efforts to combat exploitation and improve child online safety.

Meta has introduced new safety features to better protect teens on its platforms, with a focus on private messaging and harmful content prevention. The changes aim to limit the spread of harmful or exploitative content, particularly in direct messages (DMs). The update also gives teens new tools to stay safe, such as surfacing information about who they are chatting with, including when the account was created and whether it appears suspicious. Teens can now block and report an account with a single tap.
In June alone, Meta says teens blocked or reported 2 million accounts in response to safety alerts. The measure is part of a broader effort to protect teens from online risks, particularly after lawmakers criticized the company for not doing enough to prevent online exploitation.
Earlier this year, Meta removed over 135,000 Instagram accounts for offensive content. These accounts frequently left inappropriate comments on, or requested explicit images from, profiles managed by adults on behalf of children. Approximately 500,000 connected accounts were removed across Instagram and Facebook.
Meta now places teen accounts, and accounts representing children, under its strictest protection settings by default. These settings filter out inappropriate comments and limit messages from strangers. Although Instagram requires users to be 13 or older, adults can manage accounts for younger children, as long as this is clearly stated in the bio.
To prevent spam and impersonation schemes, Meta removed 10 million fake accounts claiming to be popular content creators in the first half of 2025.
The move comes as US lawmakers push for stricter limits on social media platforms. The Kids Online Safety Act has been reintroduced, with the goal of holding platforms accountable for child safety.
Information referenced in this article is from CNBC.