
April 12, 2026
Social Media Ban Tested
When Australia passed the Online Safety Amendment (Social Media Minimum Age) Act 2024, it aimed to push social media platforms to keep under-16s off their services.
Instead of penalizing teens, the law targets companies like Meta, Snap, and Google, requiring them to verify users’ ages or face major fines. Platforms have introduced facial estimation, behaviour tracking, and ID checks, but enforcement remains uneven—and many teens are still active.
As one teenager put it:
“We just want to stay in contact with our friends.”
Another added:
“A ban won’t fix the issues they think it will.”
Workarounds are everywhere. Teens are using fake birthdates, borrowing accounts, or shifting to smaller platforms and private groups. The result isn’t less social media use—it’s different, often harder-to-monitor use.
At the same time, new platforms are emerging with a different approach. One example is 65square, which is positioning itself as a safety-first alternative. Each user is verified with a passport, facial analysis runs at every login, and children are grouped into age brackets designed to offer age-appropriate experiences as they mature. Adults can't message or see kids, and children in different age brackets are likewise kept separate from one another.
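The bracket-isolation model described above can be sketched in a few lines. This is a minimal illustration of the general idea, not 65square's actual implementation: all names, brackets, and the interaction rule here are assumptions for demonstration.

```python
# Hypothetical sketch of age-bracket isolation: users are assigned a bracket
# after identity verification, and messaging/visibility is permitted only
# within the same bracket. Bracket names and the rule itself are illustrative
# assumptions, not 65square's real design.
from dataclasses import dataclass

BRACKETS = ["under_13", "13_15", "16_17", "adult"]

@dataclass
class User:
    name: str
    bracket: str  # assigned after passport verification and login-time facial check

def can_interact(a: User, b: User) -> bool:
    """Allow messaging and profile visibility only within the same age bracket."""
    return a.bracket == b.bracket

alice = User("alice", "13_15")
bob = User("bob", "adult")
carol = User("carol", "13_15")

print(can_interact(alice, bob))    # adult and minor: blocked -> False
print(can_interact(alice, carol))  # same bracket: allowed -> True
```

The design choice worth noting is that the rule is symmetric and default-deny: rather than blocking a list of known-bad interactions, anything outside a user's own bracket is simply invisible.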
The goal is not just to restrict access, but to actively protect vulnerable users while they are online, while still giving them the freedom to be themselves.
This reflects a broader shift in thinking. While Australia’s law tries to reduce harm by limiting access, platforms like 65square are attempting to reduce harm within the experience itself.
The contrast is important. Age bans create friction, but they don’t eliminate risk. Safety-focused design, on the other hand, assumes young people will still be online—and tries to make those spaces safer.
Australia’s policy has forced platforms to act, but it hasn’t fully achieved its core goal. Teens are still online. They’re just navigating the system differently.
References
- Online Safety Amendment (Social Media Minimum Age) Act 2024
- eSafety Commissioner
- Reuters: Australia social media law implementation (2025)
- The Guardian: Teen experiences under the ban (2026)
- The Washington Post: Youth reactions and workarounds
- Associated Press: Global response and enforcement challenges


