Published Date: 8/8/2025
The question of how Donald Trump’s administration will affect digital identity outside the U.S. has been a significant topic of discussion in Europe and the UK. Fears that online safety laws could be used as bargaining chips in trade negotiations appear to be well-founded, with Reuters reporting that the administration has instructed its diplomats in Europe to launch a lobbying campaign against the EU’s Digital Services Act (DSA).
Issued as a so-called “action request,” the cable signed by U.S. Secretary of State Marco Rubio says the DSA imposes “undue” restrictions on freedom of expression. It directs the nation’s ambassadors in Europe to convey U.S. concerns about the DSA and the financial costs for U.S. companies.
The cable instructs: “Posts should meet with government officials, businesses, civil society, and impacted individuals to report on censorship cases, including but not limited to those related to the DSA.” Moreover, U.S. diplomats should investigate “any government efforts to suppress protected forms of expression or coerce private companies to do the same,” with priority given to incidents that impact U.S. citizens and companies.
The campaign’s stated objective is to “focus efforts to build host government and other stakeholder support to repeal and/or amend the DSA or related EU or national laws restricting expression online.” It provides specific suggestions to U.S. diplomats on how the EU law may be changed, and talking points to help them make that argument. These include notes on a reworking of how the DSA defines “illegal content,” notably to prevent it from stifling “political and religious discourse;” the potential withdrawal of the DSA’s Code of Conduct on Disinformation framework; and removing or reducing fines for noncompliance.
There is a bitter irony in the U.S. accusing the EU of overreach with its online safety legislation, while running a campaign to export a Trumpified version of America’s First Amendment-branded idea of free speech to sovereign nations with different cultures and laws, using economic and diplomatic pressure. The stage has been set since shortly after Trump’s inauguration. In February, U.S. Vice President J.D. Vance accused EU leaders of censoring the speech of groups including Germany’s AfD party, which espouses an ethnic nationalist ideology.
Following that, the chairman of the U.S. Federal Communications Commission (FCC) said the DSA was “not compatible with America’s free speech tradition.” In May, Rubio raised the possibility of denying visas to those deemed to be censoring Americans’ free speech, including on social media. Unsurprisingly, the administration has found support in its efforts from Meta, X, and other Silicon Valley social media giants that have amassed huge international user bases and have used their platforms to wield political influence.
The European Commission rejects the notion that the DSA amounts to censorship, calling such claims “completely unfounded.” The EU’s antitrust body has assured U.S. lawmakers that the DSA does not target U.S. companies and that Europe is committed to keeping digital markets open. These are at best tepid responses to a development with overarching implications, for both the tech world and beyond. Critique is one thing, but an orchestrated action to mold the world’s laws into the shape of U.S. policy under Donald Trump should raise loud alarm bells for legislators, small to medium-sized tech firms in the biometrics and digital ID space, and just about everyone else.
There are legitimate critiques to be made of the DSA. An analysis in Euractiv offers one, focused on the EU’s age assurance rules. The DSA does not prescribe the use of specific age verification or age estimation technologies. Instead, it requires that any method comply with the EU’s privacy and data security guidelines, and offers a software blueprint for entities to build on and customize for specific jurisdictions and age assurance use cases. The European Commission’s prototype for a tokenized system is built around privacy-by-design, a core principle of the General Data Protection Regulation (GDPR), and the Commission aims to further tighten data sharing restrictions through the adoption of zero knowledge proofs (ZKPs).
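To make the tokenized, privacy-by-design idea concrete, here is a minimal Python sketch of an anonymous “over 18” token. Everything in it – the function names, the HMAC-based scheme – is purely illustrative and not the Commission’s actual design, which relies on cryptographic credentials and is moving toward zero knowledge proofs.

```python
import hashlib
import hmac
import secrets
from typing import Optional

# Hypothetical issuer key for illustration only; a real deployment would
# use public-key signatures so verifiers never hold the signing secret.
ISSUER_KEY = secrets.token_bytes(32)


def issue_token(user_is_over_18: bool) -> Optional[bytes]:
    """The age-assurance issuer checks the user's age once, then returns
    an anonymous token: a random nonce plus a MAC over the single claim
    "over18". No name, birthdate, or identifier is embedded in the token
    (data minimization)."""
    if not user_is_over_18:
        return None
    nonce = secrets.token_bytes(16)
    tag = hmac.new(ISSUER_KEY, b"over18:" + nonce, hashlib.sha256).digest()
    return nonce + tag


def verify_token(token: bytes) -> bool:
    """A platform checks the claim without learning who the user is."""
    nonce, tag = token[:16], token[16:]
    expected = hmac.new(ISSUER_KEY, b"over18:" + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)
```

In this toy scheme the verifier still shares a key with the issuer; the direction of travel described above is to replace the MAC with a public-key signature and, eventually, a zero knowledge proof, so that the claim can be checked without any party linking the check back to the user.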
Still, the Euractiv analysis argues, the EU model “isn’t without issues,” noting the potential to limit kids’ access to more of the internet than is expressly intended. It quotes the privacy rights collective EDRi, which says “there is a real risk that age-based exclusion will be prioritized over the harder and more structural work of making online environments safer.”
Another critique, posted on the blog DSA Observatory, says the EU age assurance law risks “oversimplifying a complex regulatory trade-off.” It argues that the guidelines overstate the proportionality of age verification and age estimation methods, sideline key data protection concerns, and miss the opportunity to articulate the implications of a rights-based, privacy-preserving design for all users. The author, Sophie Stalla-Bourdillon of the Brussels Privacy Hub, identifies three key concerns.
First, age assurance features are subject to data protection by design and by default requirements. In assessing whether an age assurance method is appropriate and proportionate – i.e., whether it ensures a high level of privacy, safety, and security for minors that could not otherwise be achieved through less intrusive means – the Commission’s approach is weakened by broad, generic assertions about both age verification and biometric age estimation. A key consideration is whether the method adheres to the data minimization principle and complies with data protection by design and by default under Article 25 GDPR, which necessitates a case-by-case assessment and careful examination of implementation details. Since the scope of online services obligated to implement age assurance is so broad, it is problematic to make generic statements about the appropriateness and proportionality of these methods without examining the specifics of their implementation.
The second concern is that biometric age verification prototypes are not necessarily real-life solutions. Here, the author takes aim at the rollout of age assurance laws and technology, which has seen the EU try to bridge the gap before the official launch of the EU Digital Identity (EUDI) Wallet scheme in 2026 with the “mini wallet,” a white label app for tokenized age verification. In essence, Stalla-Bourdillon argues that the Commission’s solution cannot be both a benchmark and a work in progress, even if it plans to adopt ZKPs. The technical specifications, she says, are simply not yet ready.
The third observation is that protecting all users’ rights makes it easier to protect children’s rights. Privacy-by-default profile settings are recommended by the EC as a key safeguard for minors, but a strict application of Article 25 GDPR would require them for adults as well: neither adults nor children should be exposed to manipulative or exploitative design features. The law, in effect, is for everyone, not just children. Balancing fundamental rights – including weighing the various rights of children against one another and against those of other users – cannot be easily automated, and necessarily involves trade-offs that are difficult to prioritize or rank. The upshot is that the European Commission’s guidelines do not clearly articulate the implications of the interplay between the DSA and the GDPR. Without a clearer delineation of scope and a recognition of inherent trade-offs, the guidelines may fall short of effectively balancing the fundamental rights at stake.
Q: What is the Digital Services Act (DSA)?
A: The Digital Services Act (DSA) is a set of EU regulations designed to govern the responsibilities and liabilities of digital service providers, including online platforms, to ensure a safer and more transparent digital environment.
Q: Why is the US lobbying against the DSA?
A: The US administration claims that the DSA imposes undue restrictions on freedom of expression and poses financial costs for US companies. They are lobbying to repeal or amend the DSA to align with US free speech traditions.
Q: What are the key concerns with the EU's age assurance laws?
A: Critics argue that the EU's age assurance laws may oversimplify the regulatory trade-offs, overstate the proportionality of age verification methods, and sideline key data protection concerns. There are also concerns about the technical readiness and the potential to limit kids' access to more of the internet than intended.
Q: How does the EU respond to US criticisms of the DSA?
A: The European Commission rejects the notion that the DSA amounts to censorship, stating that such claims are unfounded. They assure that the DSA does not target US companies and is committed to keeping digital markets open.
Q: What is the role of privacy-by-design in the EU's age assurance model?
A: Privacy-by-design is a core principle of the General Data Protection Regulation (GDPR) and is central to the EU's age assurance model. It requires that any age verification or estimation method must comply with privacy and data protection guidelines to ensure a high level of privacy for users.