Meta Bans Australian Children From Instagram and Facebook Amid Safety Shift

Meta has begun removing Australian children under the age of 16 from Instagram, Facebook and Threads, moving ahead of a nationwide teen social media ban that takes effect next week. The company began sending alerts last month to users identified as aged between 13 and 15, notifying them that their accounts would start closing from 4 December.
Authorities estimate that about 150,000 Facebook profiles and roughly 350,000 Instagram accounts will be caught up in the move; Threads is also included because it requires an Instagram account to function. Australia will introduce its world-first prohibition on under-16s using major social platforms on 10 December. Under the new rules, companies must take reasonable steps to prevent these users from having accounts or risk fines of up to A$49.5 million. A Meta spokesperson told the BBC that the company intends to comply with the law but emphasised that meeting the requirements will involve an ongoing and layered process.
Meta argues that a standardised age-verification system should sit at the app-store level. According to the company, verifying age at the point of download would simplify the process for both families and platforms and spare Australian children and teenagers from having to confirm their age separately on multiple apps.
Meta has informed young users identified as under 16 that they can save their posts, videos and messages before their accounts are shut down. Those who believe they were wrongly flagged can request a review by submitting a video selfie for age analysis or providing official identification such as a driver’s licence.
While Meta’s platforms are the most visible part of the crackdown, the ban also applies to YouTube, X, TikTok, Snapchat, Reddit, Kick and Twitch. Government officials say the purpose of the legislation is to shield Australian children from harmful online experiences. Critics warn that the measure could isolate young people who rely on social platforms to maintain friendships or push them toward less regulated spaces on the internet.
Communications Minister Anika Wells acknowledged that the early stages of enforcement may bring challenges but stressed that the priority is protecting Generation Alpha, defined as those under 15, along with future generations. She described today’s youth as constantly exposed to addictive algorithms, likening the effect to a dopamine feed that kicks in as soon as they receive a smartphone and join a platform.
Authorities are also watching emerging platforms such as Lemon8 and Yope amid concerns that young users may migrate there. The eSafety Commissioner recently asked both apps to determine whether they fall under the scope of the ban. Yope’s chief executive said the service operates more like a private messenger and does not fit the definition of a social platform.
YouTube, which was initially excluded from the ban but later added, criticised the law, arguing that removing Australian children’s accounts eliminates built-in parental controls and may unintentionally make the platform less safe.
The policy follows a government study showing that 96 percent of Australian children aged 10 to 15 use social media. The research found high exposure to harmful content, including violent material, misogyny, pro-eating-disorder posts and self-harm content, and also revealed significant risks such as grooming attempts and widespread cyberbullying.

Source: BBC