Instagram teen accounts will now be automatically guided by a PG-13 movie ratings system in what Meta says is its strongest push yet to make the platform safer for young people.

The update places users under 18 into stricter default settings that limit exposure to adult, drug-related, and suggestive content, with parental permission required to opt out.

“We want teens to have a safer experience online by automatically limiting what they see,” Meta’s Regional Policy Director for Australia and New Zealand, Mia Garlick, said.

“These updates are designed to align with PG-13 standards and give parents more control.”

Ms Garlick said the new system would use age-prediction technology to detect teens who falsely register as adults. “We recognise that no system is perfect,” she said, “but we hope this reassures parents that we’re working to show teens safe, age-appropriate content on Instagram by default.”

The company says the changes build on previous teen-account protections and will also block searches for a wider range of mature keywords and prevent teens from following accounts that regularly share inappropriate material.

However, not everyone is convinced the move will make a big difference.

Perth-based media analyst Ming Johansson said she did not believe Meta’s update would work.

“I think this won’t be implemented very well,” she said, and warned it might create a “false sense of security” for parents.

“Meta has a history of manipulating feeds and experimenting on users,” Johansson said.

“Kids and teenagers are incredibly innovative with workarounds. If you tell them not to do something, they’ll find a way to do it.”

She added that lasting change would require education, not just algorithms.

“Teaching kids to be discerning and understand bias online is far more effective than blanket restrictions.”

The Youth Affairs Council of Western Australia (YACWA) welcomed Meta’s focus on youth safety but questioned its overall impact, citing research that found only 17% of Meta’s earlier safety tools were fully functional (Béjar, 2025).

“While YACWA supports digital safety measures to reduce online harms, we remain sceptical about whether these reforms alone will be effective,” a spokesperson said.

YACWA has called on the government to introduce stronger online duty-of-care regulations that embed young people’s voices in future digital-safety frameworks.

Meta says it will continue to refine its teen-account protections and expand its safety tools across platforms later this year.