New Safety Measures for Teenagers on Instagram
Teenagers using Instagram will soon be limited to PG-13 content by default, and they won’t be able to change this setting on their own. The new policy was announced by Meta, Instagram’s parent company. According to a blog post, the change includes hiding or declining to recommend posts that contain strong language, certain risky stunts, and content that might encourage harmful behaviors, such as posts showing marijuana paraphernalia.
Currently, users under 18 are automatically placed into a restrictive teenage account unless a parent or guardian allows them to opt out. These accounts are private by default and have usage restrictions. They also filter out more ‘sensitive’ content, such as posts promoting cosmetic procedures.
However, many teenagers lie about their age when signing up for social media. Although Meta has started using artificial intelligence to detect these cases, the company hasn’t confirmed how many accounts registered as adults are actually used by minors.
To further enhance online safety for children, Meta is also introducing an even stricter setting that parents can enable for their children. As part of its efforts to add safeguards for younger users, Meta has promised not to show teenagers inappropriate content, such as posts about self-harm, eating disorders, or suicide. However, these safeguards haven’t always been effective.
A recent report found that teen accounts created by researchers were recommended age-inappropriate sexual content, including ‘graphic sexual descriptions, the use of cartoons to describe demeaning sexual acts, and brief displays of nudity.’ Additionally, Instagram recommended a ‘range of self-harm, self-injury, and body image content’ to teenage accounts. The report states that this content would be reasonably likely to result in adverse impacts for young people, including poor mental health, self-harm, and suicidal ideation and behaviors.
Meta criticized the report as ‘misleading, dangerously speculative,’ claiming it misrepresents its efforts on teenage safety. Josh Golin, executive director of the Fairplay organization, expressed skepticism about the implementation of Meta’s new plan. He stated that these announcements are primarily about forestalling legislation that Meta doesn’t want to see and reassuring concerned parents. He emphasized that ‘splashy press releases won’t keep kids safe, but real accountability and transparency will.’
Meta already blocks certain search terms related to sensitive topics such as suicide and eating disorders. The latest update will expand this to a broader range of terms, such as ‘alcohol’ and ‘gore.’ The PG-13 update will also apply to artificial intelligence chats and experiences targeted at teenagers.
For parents who want an even stricter setting for their kids, Meta is launching a ‘limited content’ restriction. This will block more content and remove teens’ ability to see, leave, or receive comments under posts.
Other countries are going further and restricting social media for under-16s altogether. In Denmark, children under 16 will be banned from using social media without parental permission, as announced by the country’s prime minister. The Scandinavian country follows Australia and Norway in restricting sites like Facebook, TikTok, and Instagram for those 15 and under.