Published Date: 8/8/2025
The question of how aspiring autocrat Donald Trump’s administration will affect digital identity outside the U.S. has been a significant concern in Europe and the UK. Fears that online safety laws could be used as bargaining chips in trade negotiations appear well-founded. According to Reuters, the U.S. administration has directed its diplomats in Europe to launch a lobbying campaign against the EU’s Digital Services Act (DSA).
Issued as an “action request,” the cable signed by U.S. Secretary of State Marco Rubio states that the DSA imposes “undue” restrictions on freedom of expression. It instructs the nation’s ambassadors in Europe to convey U.S. concerns about the DSA and the financial costs for U.S. companies. The cable also directs diplomats to meet with government officials, businesses, civil society, and impacted individuals to report on censorship cases, including those related to the DSA. U.S. diplomats are to investigate any government efforts to suppress protected forms of expression or coerce private companies to do the same, with priority given to incidents affecting U.S. citizens and companies.
The campaign’s stated objective is to build support among host governments and other stakeholders for repealing or amending the DSA or related EU or national laws restricting online expression. The cable provides specific suggestions to U.S. diplomats on how the EU law might be changed, along with talking points to help them make that argument. These include notes on reworking how the DSA defines “illegal content” to prevent it from stifling political and religious discourse, the potential withdrawal of the DSA’s Code of Conduct on Disinformation framework, and removing or reducing fines for noncompliance.
There is a bitter irony in the U.S. accusing the EU of overreach with its online safety legislation while simultaneously running a campaign to export a Trumpified version of America’s First Amendment-branded idea of free speech to sovereign nations with different cultures and laws, using economic and diplomatic pressure. The stage was set shortly after Trump’s inauguration. In February, U.S. Vice President J.D. Vance accused EU leaders of censoring the speech of groups, including Germany’s AfD party, which espouses an ethnic nationalist ideology. The chairman of the U.S. Federal Communications Commission (FCC) later said the DSA was “not compatible with America’s free speech tradition.”
In May, Rubio raised the possibility of denying visas to those deemed to be censoring Americans’ free speech, including on social media. Unsurprisingly, the administration has found support in its efforts from Meta, X, and other Silicon Valley social media giants that have amassed huge international user bases and have used their platforms to wield political influence. The European Commission rejects the notion that the DSA amounts to censorship, calling such claims “completely unfounded.” The EU’s antitrust body has assured U.S. lawmakers that the DSA does not target U.S. companies and that Europe is committed to keeping digital markets open.
There are legitimate critiques of the DSA. An analysis in Euractiv targets the EU’s age assurance laws, noting that while the DSA does not prescribe specific age verification or age estimation technologies, it requires that any method comply with the EU’s privacy and data security guidelines. The European Commission’s prototype for a tokenized system is built around privacy-by-design, a core principle of the General Data Protection Regulation (GDPR). The Commission aims to further tighten data sharing restrictions through the adoption of zero-knowledge proofs (ZKPs).
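The idea behind a tokenized, privacy-by-design age check can be illustrated with a deliberately simplified sketch. This is not the Commission's actual protocol (which is expected to use cryptographic credentials and, eventually, zero-knowledge proofs); the function names, the shared HMAC key, and the token format are all illustrative assumptions. The point is the data flow: a trusted issuer signs a token carrying only a boolean claim, so the service verifying it never sees a birthdate or identity.

```python
import base64
import hashlib
import hmac
import json

# Illustrative only: a real deployment would use asymmetric credentials
# or zero-knowledge proofs, not a shared symmetric key.
ISSUER_KEY = b"demo-issuer-secret"  # hypothetical issuer key


def issue_age_token(over_18: bool) -> str:
    """Issuer signs a token carrying only a boolean claim -- no
    birthdate, name, or identifier is included (data minimization)."""
    claim = json.dumps({"over_18": over_18}).encode()
    sig = hmac.new(ISSUER_KEY, claim, hashlib.sha256).digest()
    return (base64.urlsafe_b64encode(claim).decode()
            + "." + base64.urlsafe_b64encode(sig).decode())


def verify_age_token(token: str) -> bool:
    """The service checks the issuer's signature and learns only the
    boolean claim, nothing else about the user."""
    claim_b64, _, sig_b64 = token.partition(".")
    claim = base64.urlsafe_b64decode(claim_b64)
    sig = base64.urlsafe_b64decode(sig_b64)
    expected = hmac.new(ISSUER_KEY, claim, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected) and json.loads(claim)["over_18"]


token = issue_age_token(True)
print(verify_age_token(token))  # True -- and the verifier saw no birthdate
```

A ZKP-based scheme goes further than this sketch: the user could prove the "over 18" predicate without the verifier even being able to link two presentations of the same token back to one person.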
However, the EU model is not without issues. The critique notes the potential to limit kids’ access to more of the internet than intended. Privacy rights collective EDRi warns that there is a real risk that age-based exclusion will be prioritized over the harder and more structural work of making online environments safer. Another critique, posted on the DSA Observatory blog, argues that the EU age assurance law risks oversimplifying a complex regulatory trade-off. The guidelines overstate the proportionality of age verification and age estimation methods, sideline key data protection concerns, and miss the opportunity to articulate the implications of a rights-based, privacy-preserving design for all users.
The author, Sophie Stalla-Bourdillon of the Brussels Privacy Hub, identifies three key concerns. First, age assurance features are subject to data protection by design and by default requirements. The assessment of whether an age assurance method is appropriate and proportionate, ensuring a high level of privacy, safety, and security for minors, is weakened by broad, generic assertions about both age verification and biometric age estimation. Because the scope of online services obligated to implement age assurance methods is so broad, it is problematic to make generic statements about their appropriateness and proportionality without examining the specifics of implementation.
Second, biometric age verification prototypes are not necessarily real-life solutions. The EU is trying to fill the gap before the official launch of the EU Digital Identity (EUDI) Wallet scheme in 2026 with a “mini wallet” or white label app for tokenized age verification. Stalla-Bourdillon argues that the Commission’s solution cannot be both a benchmark and a work in progress, even if it is planning to adopt ZKPs. The technical specifications are simply not yet ready.
Third, protecting all users’ rights makes it easier to protect children’s rights. A strict application of Article 25 GDPR should require privacy-by-default profile settings for both children and adults. Neither adults nor children should be exposed to manipulative or exploitative design features. The law, in effect, is for everyone, not just children. Balancing fundamental rights, including those of children in relation to other users, cannot be easily automated. This necessarily leads to trade-offs that are difficult to prioritize or rank. The upshot is that the European Commission’s guidelines do not clearly articulate the implications of the interplay between the DSA and the GDPR. Without a clearer delineation of scope and a recognition of inherent trade-offs, the guidelines may fall short of effectively balancing the fundamental rights at stake.
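The privacy-by-default point can be made concrete with a small sketch. Under Article 25 GDPR, the most protective option should be the starting state for every account, adult or child, with users opting in to more permissive settings rather than opting out of invasive ones. The field names below are illustrative assumptions, not any platform's real schema.

```python
from dataclasses import dataclass


@dataclass
class ProfileSettings:
    """Privacy by default: every field starts at its most protective
    value for all users, not just minors (cf. Art. 25 GDPR)."""
    profile_public: bool = False       # discoverable only by explicit opt-in
    personalized_ads: bool = False     # no profiling unless requested
    location_sharing: bool = False
    dm_from_strangers: bool = False
    data_retention_days: int = 30      # minimal retention window


def relax(settings: ProfileSettings, **opt_ins) -> ProfileSettings:
    """Users may opt *in* to more permissive settings; nothing is
    loosened silently or by default."""
    for key, value in opt_ins.items():
        if not hasattr(settings, key):
            raise ValueError(f"unknown setting: {key}")
        setattr(settings, key, value)
    return settings


default = ProfileSettings()  # the safest state, for everyone
adult = relax(ProfileSettings(), personalized_ads=True)  # explicit opt-in
```

The design choice here mirrors Stalla-Bourdillon's argument: if the default state is already the most protective one for all users, age assurance carries less weight, because a minor who slips through still lands in a safe configuration.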
Q: What is the Digital Services Act (DSA)?
A: The Digital Services Act (DSA) is a set of regulations adopted by the European Union to govern digital services and platforms operating within the EU. It aims to ensure a safe and fair digital environment by addressing issues such as illegal content, disinformation, and the protection of users' rights.
Q: Why is the US lobbying against the DSA?
A: The US is lobbying against the DSA because it believes the act imposes undue restrictions on freedom of expression and could have significant financial costs for US companies operating in the EU. The US administration is concerned that the DSA may stifle political and religious discourse and affect the operations of major tech companies.
Q: What are the main critiques of the DSA?
A: Critiques of the DSA include concerns about overreach in defining illegal content, the potential for limiting children's access to the internet, and the lack of clear guidelines for implementing age verification and age estimation methods. Some critics argue that the DSA oversimplifies complex regulatory trade-offs and may not adequately balance fundamental rights.
Q: How does the EU respond to these critiques?
A: The European Commission rejects the notion that the DSA amounts to censorship and emphasizes that the act does not target US companies. The EU is committed to keeping digital markets open and ensuring that the DSA complies with the principles of the General Data Protection Regulation (GDPR) and other data protection laws.
Q: What is the role of privacy-by-design in the DSA?
A: Privacy-by-design is a core principle of the General Data Protection Regulation (GDPR) and is integral to the DSA. It requires that any age assurance methods comply with data protection by design and by default, ensuring that they are privacy-preserving and minimize the collection and use of personal data.