Published Date: 6/30/2025
The U.S. Internal Revenue Service’s (IRS) push to modernize digital identity verification sits at the crossroads of fraud prevention, AI oversight, and public trust. Over the past decade, the agency has grappled with identity theft and refund fraud that once siphoned billions of dollars from the Department of the Treasury. To combat this, the IRS introduced biometric technologies, particularly facial recognition, through initiatives like the Secure Access Digital Identity (SADI) platform and partnerships with private providers such as ID.me. While these efforts have boosted verification rates, a recent GAO audit highlights significant gaps in governance and accountability.

The IRS’s shift to biometric identity proofing began in earnest after 2020, driven by the need to streamline services during the pandemic. By 2022, the agency mandated NIST Identity Assurance Level 2 verification across dozens of applications, requiring users to submit selfies and driver’s license photos for facial recognition; fallback options included live video chats for those unable to use the automated system. This approach expanded access, especially for underserved populations, but also raised concerns about overreliance on a single vendor, ID.me. Verification pass rates jumped from 30% under the outdated password-based system to over 70% with biometrics, yet the lack of measurable goals and outcome-based benchmarks has left the IRS unable to assess whether those rates meet service needs.

Jay McTigue, director of strategic issues at the Government Accountability Office (GAO), emphasized that while the IRS has improved taxpayer access, its oversight remains insufficient. A June 2025 audit confirmed this, noting that the agency lacks documented procedures for evaluating or sharing performance data across departments. The result is silos in which cybersecurity teams, procurement staff, and program managers may work from incomplete information. The GAO also faulted the IRS for failing to list ID.me’s AI tools in its official AI inventory, a violation of transparency mandates under Executive Order 13960 and the Advancing American AI Act.

Another critical issue is the IRS’s dependence on ID.me, the only vendor able to meet federal standards on the program’s urgent timeline. While ID.me asserts compliance and strong performance, the GAO warned that the IRS cannot rely solely on the vendor’s self-assessments. Contractual shortcomings further complicate oversight: by procuring through a Treasury-run blanket purchase agreement, the IRS bypassed performance evaluation plans. Although the contract includes privacy safeguards such as deletion of biometric data within 48 hours, the IRS relies on ID.me’s self-attestation for enforcement, leaving compliance unverified.

The risks of these gaps are tangible. As of the 2025 filing season, the IRS had flagged over 2.1 million returns as potential identity theft, delaying refunds. National Taxpayer Advocate Erin M. Collins has highlighted that these cases often take months to resolve and disproportionately affect low-income filers. With nearly 400,000 cases pending in the Identity Theft Victim Assistance unit, the IRS faces pressure to cut resolution times to four months. The GAO has since recommended outcome-based goals, systematic evaluations, and formal data-sharing procedures to address these challenges.

The IRS agreed to implement all of the GAO’s recommendations, but execution remains a hurdle. The agency’s digital ID strategy must balance innovation with transparency, ensuring that biometric tools do not replicate systemic inequalities. While facial recognition and AI have improved efficiency, their unchecked use through third-party providers risks exclusion, opacity, and potential abuse. The IRS’s experience serves as a cautionary tale for federal agencies: technology modernization must be anchored in accountability and user-centric design.

Biometric identity verification, if poorly governed, can deepen disparities. With clear goals, privacy protections, and oversight, however, it can also enhance security and equity. The success of the IRS’s program, and of similar efforts across government, will depend not on the sophistication of the technology but on the strength of the governance frameworks behind it.
Q: What are the main concerns with the IRS's biometric identity verification program?
A: The IRS's program faces criticism for inadequate oversight, reliance on a single vendor (ID.me), and gaps in AI transparency. A GAO audit highlighted the lack of measurable goals and data-sharing procedures, as well as noncompliance with AI governance mandates.
Q: How has the IRS improved identity verification since adopting biometrics?
A: The IRS saw verification pass rates jump from 30% under password systems to over 70% with biometrics. This shift expanded access, especially for underserved populations, by replacing memory-based security questions with facial recognition.
Q: Why is ID.me a point of contention in the IRS's program?
A: ID.me is the sole vendor for the IRS’s biometric verification, raising concerns about systemic vulnerabilities. The GAO warned that the IRS cannot rely solely on the vendor’s self-assessments, as there are no independent audits to verify compliance with privacy and performance standards.
Q: What recommendations did the GAO make to the IRS?
A: The GAO recommended establishing outcome-based performance goals for each IRS application, conducting systematic evaluations of program effectiveness, formalizing data-sharing procedures, and ensuring compliance with AI transparency requirements.
Q: How do delays in identity verification affect taxpayers?
A: Delays in identity verification have caused refund waits stretching for months, and in some cases longer, disproportionately impacting low-income filers. The IRS flagged over 2.1 million returns as potential identity theft, with nearly 400,000 cases pending in its Identity Theft Victim Assistance unit.