CRBC News

Australia to Block Under-16s From Major Social Platforms From Dec. 10 — What Parents and Platforms Need to Know

From December 10, Australia will require major social platforms to block users under 16 from creating accounts, a move watched by regulators around the world. Platforms must identify and remove underage profiles; Meta says it is already deactivating some accounts and allows age checks via a video selfie or government ID. Covered services include Facebook, Instagram, Snapchat, TikTok, Kick, Twitch and YouTube; Roblox, Pinterest and WhatsApp are currently exempt but may be reviewed. Firms that do not take "reasonable steps" to comply risk fines up to $32 million.

Australia will introduce a world-first rule on December 10 requiring major social platforms to prevent users under 16 from registering accounts. The policy, watched closely by regulators worldwide, aims to reduce harms to young people online, but it raises practical questions about verification, coverage and enforcement.

How the rule will work

From December 10, platforms covered by the law must remove accounts belonging to Australian users under 16. Not every user will be asked to prove their age: only accounts suspected of breaching the restriction will be flagged for verification. Young people will still be able to view some content in read-only form on exempt parts of services, but they will not be able to hold their own profiles on platforms that fall under the rules.

Verification methods

Platforms are responsible for identifying and removing underage accounts. Canberra has not mandated a single verification method and has trialled several different approaches. Meta has begun deactivating accounts based on the age provided at sign-up and says users who are incorrectly flagged can verify their age with a video selfie or by submitting government-issued ID.

Which services are covered

The list of covered services is still evolving. The ban currently includes Facebook, Instagram, Snapchat, TikTok and streaming platforms such as Kick and Twitch. YouTube was added despite earlier suggestions that it would be exempt so children could continue to access online lessons. Other popular services, including Roblox, Pinterest and WhatsApp, are exempt for now but may be reviewed later.

Expect attempts to bypass the rules

Officials expect determined teenagers to try to skirt the restrictions. Guidance warns of tactics such as submitting forged identity documents, using AI to alter photos, or creating accounts tied to older users. Platforms are expected to develop technical and policy countermeasures, but the online safety regulator, the eSafety Commissioner, acknowledges that "no solution is likely to be 100 percent effective all of the time."

Penalties and enforcement

Regulators accept the rollout will be imperfect and that some underage accounts may remain while systems are refined. However, companies that fail to take what the regulator considers "reasonable steps" to comply could face fines of up to $32 million. Exactly how "reasonable steps" will be interpreted and enforced remains unclear and will be settled through guidance and future enforcement actions.

What this means for families and platforms

Hundreds of thousands of Australian teenagers are expected to be affected; Meta reports roughly 350,000 Instagram users aged 13–15 in Australia. Parents and educators should prepare to discuss safe online behaviour and alternatives for social interaction and learning. Platforms must balance user privacy, verification accuracy and child safety as they implement technical solutions ahead of the deadline.

Bottom line: The new rules mark a major experiment in online age verification. They aim to protect younger teens but will test the limits of verification technology, privacy safeguards and regulatory enforcement.
