Published Date: 8/8/2025
A recent report by Civio, a public interest journalism outlet, has highlighted significant risks and data privacy issues with a new biometric patient identification system being implemented in the Spanish autonomous cities of Ceuta and Melilla. The project, awarded to Dedalus and Facephi in 2021, aims to use AI-powered facial recognition to identify at least 170,000 patients in the two cities.
The system has been operational in some primary care clinics since November of last year but remains in the pilot phase at major hospitals. Civio obtained and reviewed the project's data protection impact assessment (DPIA), uncovering several concerning issues.
The report, part of Civio's series on Algorithms, describes the facial recognition system deployed by INGESA, the national body that manages healthcare in Ceuta and Melilla, as carrying a very high initial risk. Civio's review of the DPIA revealed inconsistencies and insufficient data protection guarantees, falling short of international standards. INGESA has also been criticized for a lack of transparency in its communication about the project's purpose and implementation timeline. Civio noted that INGESA did not respond to its request for comment.
Among the concerns raised by the report are the lack of a clear mechanism for obtaining patient consent, the potential for bias and exclusion based on race and gender, and issues of biometric data security. Previous breaches in the health sector have heightened these concerns. Human rights advocates fear that these issues could lead to a pushback from patients, who are the intended beneficiaries of the system.
To address these fears, INGESA has been advised to allow alternative identification methods such as health cards and passports, to justify the necessity and proportionality of the system, and to provide assurances that it will not be used for surveillance or to intrude on patients' privacy. Face biometrics has been hailed as the future of patient identification in healthcare, but the associated data privacy risks cannot be ignored.
In Spain, the Data Protection Agency (AEPD) has been vigilant about facial recognition deployments without proper data protection safeguards. For instance, in 2023, the AEPD fined the organizers of the Mobile World Congress €200,000 (about US$220,000) for installing a facial recognition system without conducting a prior data protection impact assessment. Earlier this year, the AEPD also requested DPIA details from some football clubs implementing stadium biometrics projects.
The implementation of this biometric patient identification system in Ceuta and Melilla is a significant step towards modernizing healthcare, but it must be done with careful consideration of data privacy and security to ensure that it benefits patients without compromising their rights.
Q: What is the purpose of the biometric patient identification system in Ceuta and Melilla?
A: The purpose of the biometric patient identification system is to use facial recognition technology to identify patients, aiming to improve the efficiency and accuracy of healthcare services.
Q: Which companies were awarded the contract for this project?
A: The contract for the biometric patient identification system was awarded to Dedalus and Facephi in 2021.
Q: What are the main concerns raised by the investigative report?
A: The main concerns raised by the report include the lack of a clear mechanism for obtaining patient consent, potential bias and exclusion based on race and gender, and issues of biometric data security.
Q: What has the Spanish Data Protection Agency (AEPD) done to address data privacy concerns?
A: The AEPD has fined organizations for using facial recognition without proper data protection safeguards and has requested detailed data protection impact assessments from entities implementing biometric systems.
Q: What measures have been suggested to address the privacy concerns of patients?
A: To address privacy concerns, INGESA has been advised to allow alternative identification methods, justify the necessity and proportionality of the system, and provide assurances that it will not be used for surveillance or intrusion into patients' privacy.