
UN Warns Social Media Bans for Children Could Backfire as Australia Blocks Platforms for Under-16s


Children across Australia woke up this week to an altered digital landscape. After months of political debate and public scrutiny, the government’s new under-16 social media ban abruptly locked millions of young users out of platforms including TikTok, Instagram, and YouTube, according to multiple media reports.

The move, meant as a sweeping effort to curb online harms, represents one of the most direct regulatory interventions in the world targeting children’s access to digital platforms. But within hours of the ban taking effect, global child-rights experts—including the United Nations—warned that the policy could produce serious unintended consequences.



Aims: Reducing Cyberbullying, Exploitation, and Harmful Content


Officials have defended the ban as a necessary safeguard against rising rates of cyberbullying, sexual exploitation, predatory behavior, and exposure to harmful content that the government argues have contributed to a mental-health crisis among young people.

The legislation reflects a growing trend among governments exploring hard restrictions on children’s digital use. But experts say that while the intention is understandable, the execution may be flawed.



UNICEF: Age Bans Alone “Won’t Keep Children Safe”


The UN Children’s Fund (UNICEF) responded swiftly, praising governments for taking online risks seriously but warning that blanket age restrictions carry significant dangers.


“Social media bans come with their own risks, and they may even backfire,” UNICEF said in a statement.

For many children—particularly those who are isolated, marginalized, or living in unsafe offline environments—social media represents far more than entertainment. It is often a primary channel for learning, connection, play, and self-expression.


UNICEF cautioned that bans do not stop usage; they simply push it elsewhere.


Children may:


  • Use workarounds or VPNs

  • Borrow shared devices

  • Migrate to less regulated, more dangerous platforms


This, the agency warned, “will only make it harder to protect them.”



UN Rights Chief: Tech Platforms Were Never Built with Children’s Rights in Mind


At his end-of-year press conference in Geneva, Volker Türk, the UN High Commissioner for Human Rights, echoed UNICEF’s concerns.


“We know how difficult it is for societies to grapple with the issue of how to keep children safe online,” he said. “We have had social media platforms for many years, but I don’t think at the stage when they were launched that a human rights impact assessment was actually done.”

Türk noted that Australia is not alone: the U.S. state of California has adopted a similar approach, and the European Union is currently debating draft legislation aimed at protecting minors online.



A Broader Strategy Is Needed, UN Says


UNICEF emphasized that age restrictions cannot replace a comprehensive digital-safety strategy.


“Age restrictions must be part of a broader approach that protects children from harm, respects their rights to privacy and participation, and avoids pushing them into unregulated, less safe spaces,” the agency stated.

Key points from the UN:


  • Regulation cannot substitute for platform responsibility. Companies must invest in safer product design, child-sensitive algorithms, and robust content moderation.

  • Governments, families, and tech companies must work together, centering children’s rights in policy design.

  • Platforms should be redesigned with safety and well-being at the core, not added as an afterthought.

  • Regulators must enforce systemic measures to prevent and mitigate online harm—not just restrict access.



Parents Are Being Asked to Do the Impossible


UNICEF also highlighted the untenable burden currently placed on parents and caregivers.


“They have a crucial role but are being asked to do the impossible: monitor platforms they didn’t design, police algorithms they can’t see, and manage dozens of apps around the clock.”

Supporting families through digital-literacy education, guidance, and transparent platform tools is essential, the agency said.



Monitoring What Works


As more governments consider adopting similar restrictions, the UN says policymakers must closely evaluate the effectiveness—and consequences—of these measures.


“It’s very important to keep monitoring what works, what doesn’t work,” Türk said, adding that the best interests of the child must guide every regulatory step.

The debate now underway in Australia will shape how countries worldwide respond to the rapidly evolving challenge of safeguarding children online—balancing protection with autonomy, safety with access, and regulation with rights.
