Reddit Challenges Australia Under-16 Social Media Ban in Court
General Studies Paper III: IT & Computers, Social Media
Why in News?
Recently, Reddit challenged Australia’s under-16 social media ban in the country’s High Court. The platform argues that the new law is ineffective and risks limiting essential political discussion online, raising broader questions about regulation and free speech.
Australia’s Under-16 Social Media Ban
- Origin: On 29 November 2024, Australia passed the Online Safety Amendment (Social Media Minimum Age) Act 2024, an amendment to the existing Online Safety Act 2021. Its core objective is to prevent Australians under 16 years of age from holding accounts on major social media platforms, making it the first law of its kind in the world.
- Objectives: Australian authorities describe the law as a major step in protecting children. The government argues that early exposure to social media can negatively affect young people’s mental health, attention, and well-being. It claims that by delaying access, children are more likely to develop offline skills and resilience. The ban would support healthier childhood experiences and reduce pressures faced by young people online.
- Provisions: Although the Under-16 Social Media Ban law was passed in late 2024, its enforcement began on 10 December 2025. From that date, platforms must deactivate existing under-16 accounts and block new ones until users turn 16.
- Under the new legal framework, social media companies must verify the age of users to ensure that no one under 16 can have an account on their service. These companies must take “reasonable steps” to prevent age-restricted users from accessing these platforms while logged in.
- The law defines an “age-restricted social media platform” as any digital service whose main purpose includes enabling social interaction between people online. Services that allow users to post material and to link to or interact with other users fall under this definition.
- Included platforms currently cover major services such as Facebook, Instagram, Snapchat, TikTok, X (formerly Twitter), Reddit, Twitch, Threads, YouTube and Kick. These platforms must act to ensure that Australians under 16 do not have accounts.
- Platforms that fail to take these steps face civil penalties of up to A$49.5 million (about US$32 million) per breach, which courts can impose for non-compliance.
- The burden of compliance lies with the platforms themselves and not with parents or minors. The law does not criminalise under-16s for logging in. Penalties target companies that do not implement the measures.
Reddit’s Legal Grounds and Core Arguments Against the Law
- On 12 December 2025, Reddit filed a legal challenge in the High Court of Australia against the minimum age requirement introduced by the Online Safety Amendment (Social Media Minimum Age) Act 2024, asking the court to review the law and declare it invalid.
- Reddit argued that the law is constitutionally flawed because it limits what the company calls an implied freedom of political communication in Australia. This implied freedom is a constitutional principle recognized by Australia’s High Court as part of the democratic system. Reddit claims the ban unduly restricts how young people can access and participate in political discourse online.
- The company says the law goes beyond child safety and affects broader rights of expression for those under 16. It highlighted that children approaching voting age may use social platforms to inform themselves about civic issues. Restricting that access, Reddit argues, may weaken democratic participation.
- One core argument from Reddit is that the law incorrectly classifies its platform as an “age-restricted social media platform.” Reddit says its service does not fit the legislation’s definition in the same way as other social networks, since the platform’s main activity is topic-based discussion rather than real-time social networking.
- The platform argued that this misclassification imposes unnecessary compliance obligations. Reddit claimed that this could distort how the law applies across different services, creating an illogical patchwork of covered and excluded platforms.
- Reddit claimed the law isolates teens from “age-appropriate community experiences.” The platform said many of its forums serve educational, informational, and social purposes that go beyond typical social networking. Reddit pointed to existing moderation tools and content filters as ways to reduce harm without imposing a blanket age ban.
- In its court filings, Reddit requested that the High Court either declare the law invalid or exclude Reddit from the list of age-restricted social media platforms. The company did not argue that it should be completely exempt from child safety regulation, but that applying the minimum age law to Reddit specifically is inappropriate.
Global Precedents on Youth Social Media Access
- United Kingdom: The Online Safety Act 2023 protects children online by requiring platforms to assess risks to children, to use age verification or age estimation to prevent underage access where needed, and to adopt age-appropriate design for services that reach children. Regulators can fine platforms that fail to act.
- European Union: The Digital Services Act increases safety and accountability by requiring platforms to reduce risks to minors, banning targeted advertising to children, and asking platforms to adopt measures for the enhanced protection of minors. The EU also enforces child consent rules under the GDPR, with default consent ages ranging from 13 to 16 across member states.
- United States: The United States relies on the Children’s Online Privacy Protection Act (COPPA) to protect children under 13. The law requires parental consent before platforms collect data from young children. The U.S. approach focuses on privacy and data collection rather than a blanket minimum age for accounts. The U.S. has also seen enforcement actions and ongoing litigation about platforms that fail to verify ages.
- India: India regulates platforms under the IT Rules 2021 and related guidelines, which place due diligence duties on intermediaries. The DPDP Act 2023 creates a strict regime for children’s data (users under 18): it demands verifiable parental consent, bans harmful processing such as targeted advertising and tracking, and adopts a risk-based approach with potential exemptions for “verifiably safe” platforms or essential services such as health and education. Unlike older consent-only models, this balances strong protection with practical needs.
What’s Next?
A preliminary High Court hearing is set for February 2026 to establish dates for the full challenge, which has been brought by Reddit and by a digital rights group, the Digital Freedom Project, representing two 15-year-olds. The outcome will shape the future of youth online access in Australia, balancing protection with fundamental rights.

