Published Date: 7/8/2025
Ofcom, the UK’s communications regulator, is pushing forward with a bold initiative to enhance online safety, particularly for children. In its latest move, the organization has proposed a set of measures focused on implementing highly effective age assurance for livestreams. This comes as part of Ofcom’s broader ‘Year of Action,’ which emphasizes strengthening its Codes of Practice to combat illegal content and harmful behaviors online.

The proposal, now open for public consultation, outlines a series of steps to address growing concerns around livestreaming platforms. These include improving recommender systems to reduce the spread of illegal content; expanding the use of proactive technologies, such as automated detection of child sexual abuse material (CSAM), deepfakes, and suicide-related content; and introducing stricter age verification processes. The goal is to create a safer digital environment, especially for younger users who may be more vulnerable to online risks.

One of the key highlights of the proposal is its emphasis on age verification. Ofcom has stressed that platforms offering livestreaming services must adopt robust age checks to prevent underage users from accessing content that could expose them to grooming or other dangers. This aligns with the regulator’s existing guidelines, which already require providers to take steps to protect children; the new measures aim to make those protections more stringent and consistent across the board.

The need for such regulations is underscored by the potential for illegal content to spread rapidly, particularly during crises. Ofcom’s blog highlights that events like the violent riots following the Southport murders last year, or the livestreaming of terrorist attacks, can lead to widespread harm. The regulator warns that recommender systems, which prioritize content based on user engagement, can exacerbate these issues by amplifying harmful material. 
To counter this, Ofcom is urging platforms to develop systems that prioritize safety over virality.

Real-time moderation is another critical component of the proposal. Ofcom has called for platforms to ensure that human moderators are available 24/7 to review content reported during livestreams. This is especially important in cases where there is a risk of imminent physical harm, and the regulator argues that platforms must act swiftly to remove dangerous content before it spreads. This requirement is expected to place additional pressure on companies like Twitch, which has already faced scrutiny over its age verification practices.

The proposal also touches on the broader issue of digital trust. As more users turn to livestreaming for entertainment, social interaction, and even commerce, the need for reliable age verification becomes increasingly urgent. Ofcom’s guidance on highly effective age assurance is seen as a pivotal step in this direction. By integrating biometric age estimation and other advanced technologies, platforms can better ensure that only adults access age-restricted content, reducing the risk of exploitation.

The implications of this proposal extend beyond the UK. With the EU and other regions grappling with similar challenges, Ofcom’s approach could set a precedent for global online safety standards. The regulator’s commitment to holding platforms accountable is evident in its statement that it will take swift enforcement action if concerns arise. This signals a shift toward a more proactive regulatory environment in which companies are expected to prioritize user safety over profit margins.

As the consultation progresses, stakeholders have until October 2025 to provide feedback. This window allows for a thorough review of the proposal’s feasibility and potential impact. Ofcom has also hinted at further updates, with the formal enforcement of age assurance requirements set to begin on July 25, 2025. 
This timeline underscores the urgency of the issue and the need for immediate action.

The debate around age verification has been contentious, with some critics arguing that it may infringe on user privacy. However, Ofcom maintains that the benefits of protecting children from online harms far outweigh the risks. The regulator’s emphasis on transparency and user consent is a key part of its strategy, ensuring that age verification processes are both effective and ethical.

In conclusion, Ofcom’s new proposal represents a significant step forward in the fight against online harms. By focusing on age assurance, real-time moderation, and proactive technologies, the regulator is setting a new benchmark for online safety. As the digital landscape continues to evolve, such measures will be crucial in ensuring that the internet remains a safe and inclusive space for all users.
Q: What is Ofcom's new proposal for livestreams?
A: Ofcom's new proposal aims to implement highly effective age assurance measures for livestreams to protect children from online harms. It includes stricter content moderation, advanced technologies, and real-time monitoring to curb illegal activities.
Q: Why is age assurance important for livestreams?
A: Age assurance is crucial to prevent underage users from being exposed to harms such as grooming or illegal material. It ensures platforms take proactive steps to safeguard children and reduce the risk of exploitation.
Q: How will the new rules affect platforms like Twitch?
A: Platforms like Twitch will need to adopt robust age verification systems and ensure real-time moderation. This may involve integrating biometric technologies and increasing human oversight to comply with Ofcom's guidelines.
Q: What are the consequences of non-compliance with Ofcom's rules?
A: Non-compliance could result in enforcement actions, including fines or other penalties. Ofcom has emphasized its commitment to holding platforms accountable for failing to protect users from online harms.
Q: When will the new regulations take effect?
A: The formal enforcement of age assurance requirements is set for July 25, 2025. However, stakeholders have until October 2025 to provide feedback on the proposal during the consultation period.