Published Date: 10/24/2025
Everyone’s invited. That’s the message to Big Tech from Australian eSafety Commissioner Julie Inman-Grant. But it’s no party; the commissioner is warning Meta, TikTok, Snapchat, X, and YouTube that they will likely be subject to the law requiring platforms to adopt age assurance measures, such as biometric age estimation, to ensure users under 16 do not create accounts.
Inman-Grant has sent a letter to each of the five companies, reflecting her preliminary assessment of whether the major social platforms they run will be covered by the law. A report in MLex says the commissioner’s decision hinges on the platforms’ sole purpose being online social interaction – i.e., they are covered by the social media law because they are social media platforms.
Some have tried to argue that’s not the case. Snapchat has claimed that it’s primarily a messaging service, more like WhatsApp than Instagram. And YouTube has objected to having its initial exception revoked. So-called carve-outs now only exempt platforms that are explicitly designed for messaging and educational purposes, including WhatsApp and Messenger, YouTube Kids, and Google Classroom.
Inman-Grant says she will continue to engage with platforms on their status vis-a-vis the law. That may be a polite way of acknowledging that, while platforms will likely do what they can to comply when they have to, legal challenges are almost certainly forthcoming. It also reiterates the commissioner’s promise to maintain a “dynamic list” of services, which will change as they comply – or don’t. The list will take into account the expectation that “some under-16 users will migrate to other platforms.”
Over the coming weeks, eSafety will have more to say about the platforms it considers must comply with the minimum age obligations, says a statement on the eSafety Commissioner’s website. There is speculation that OpenAI’s Sora will be added to the list.
Chatbot providers get served with stern letters. Inman-Grant has already fired warning shots in the direction of four AI chatbot providers, in the form of legal notices asking them to explain how they are protecting children from “exposure to a range of harms, including sexually explicit conversations and images and suicidal ideation and self-harm.”
“There can be a darker side to some of these services with many of these chatbots capable of engaging in sexually explicit conversations with minors,” says Inman-Grant. “Concerns have been raised that they may also encourage suicide, self-harm, and disordered eating.” A release from eSafety says notices were given to Character Technologies, Inc. (character.ai), Glimpse.AI (Nomi), Chai Research Corp (Chai), and Chub AI Inc. (Chub.ai). Each must outline how they are complying with the Government’s Basic Online Safety Expectations Determination.
“We are asking them about what measures they have in place to protect children from these very serious harms. I do not want Australian children and young people serving as casualties of powerful technologies thrust onto the market without guardrails and without regard for their safety and wellbeing.”
The commissioner says she won’t hesitate to resort to punitive measures for platforms that don’t comply. Enforcement action for failing to file a reporting notice could include court proceedings and fines of up to 825,000 Australian dollars per day (about 535,397 dollars U.S.). Breach of a direction to comply may result in civil penalties of up to 49.5 million Australian dollars (about 32.2 million U.S.).
Reddit age verification under scrutiny from ICO. Regulatory pressure is also mounting in the UK, where Reddit faces a potential censure from the national data protection regulator over the platform’s age verification practices, some of which are provided by Persona. MLex says the investigation concerns the platform’s compliance with the Information Commissioner’s Office (ICO)’s Age Appropriate Design Code, or Children’s Code.
Reddit is reportedly cooperating with the ICO. But it will not be the only platform to face questions from the regulator about whether its age assurance measures put user privacy at risk. An investigation into TikTok continues, as does one into Imgur – an image-sharing platform that, rather than comply with UK laws, decamped from the country altogether. In the ICO’s view, “exiting the market does not absolve a company of responsibility for prior infringements.”
Q: What is the new law requiring social media platforms to do?
A: The new law requires social media platforms to adopt age assurance measures, such as biometric age estimation, to ensure users under 16 do not create accounts.
Q: Which platforms are likely to be covered by the new law?
A: The platforms likely to be covered by the new law include Meta, TikTok, Snapchat, X, and YouTube.
Q: What are the exceptions to the new law?
A: Platforms that are explicitly designed for messaging and educational purposes, such as WhatsApp, Messenger, YouTube Kids, and Google Classroom, are exempted from the new law.
Q: What are the penalties for non-compliance with the new law?
A: Enforcement action for failing to file a reporting notice could include court proceedings and fines of up to 825,000 Australian dollars per day. Breach of a direction to comply may result in civil penalties of up to 49.5 million Australian dollars.
Q: What is the ICO's stance on platforms that exit the market to avoid compliance?
A: The ICO’s view is that exiting the market does not absolve a company of responsibility for prior infringements.