Australia to become world’s first nation to ban social media for under-16s
The policy, approved by lawmakers last year, is set to come into force on Wednesday. Companies that fail to comply with the new requirements could face penalties of up to $33 million.
“From 10 December 2025, age-restricted social media platforms will have to take reasonable steps to prevent Australians under the age of 16 from creating or keeping an account,” the government said, describing the initiative as a way to safeguard children “at a critical stage of their development.”
Under the new rules, platforms must rely on a variety of signals—such as user behavior, viewing patterns, and profile photos—to identify underage individuals. They are also obligated to prevent minors from bypassing age limits with fake identification documents, AI-created imagery, deepfakes, or VPNs.
Major tech firms have pushed back against the legislation, characterizing it as “vague,” “problematic,” and “rushed.” TikTok and Meta acknowledged that the rules will be challenging to apply but have committed to meeting the requirements. Meta has already started deleting accounts belonging to users under 16 ahead of the December 10 start date. Snapchat and others have warned that the regulation could drive young users toward “darker corners of the internet,” while Reddit has denounced the law as “legally erroneous” and “arbitrary.”
Several other nations are considering or testing comparable policies aimed at shielding children online. The European Parliament passed a non-binding resolution in November recommending a minimum age of 16 for social media access to ensure “age-appropriate online engagement.” Denmark has put forward a proposal to bar users under 15 and is also taking part, alongside France, Spain, Italy, and Greece, in a joint pilot of an age-verification application. Malaysia has also announced plans to implement a similar ban by 2026.
Meanwhile, Russia recently blocked access to Roblox—an online gaming platform widely used by children—citing the dissemination of extremist materials and LGBTQ content.
Concerns about child safety on digital platforms have increased pressure on tech companies globally. In the US, Meta is currently defending itself against lawsuits alleging that it allowed prohibited material to persist on its platforms, including cases involving adult strangers contacting minors, as well as issues linked to suicide, eating disorders, and child sexual exploitation.