Urgent Ban on Biometric Surveillance Demanded by UK Advocacy Groups
Published Date: 30/07/2024
A coalition of 17 civil society organizations urges the UK Home Secretary to impose safeguards on AI systems in policing, advocating for a complete ban on predictive policing and biometric surveillance.
"A coalition of 17 civil society organizations, including the Open Rights Group, Big Brother Watch, Liberty, and the Network for Police Monitoring, has written to the UK Home Secretary to demand urgent action on biometric surveillance and predictive policing. The groups, collectively known as the #SafetyNotSurveillance coalition, argue that Artificial Intelligence (AI) systems used in policing perpetuate and amplify discrimination based on racial and ethnic origin, nationality, socio-economic status, disability, gender, and migration status.
The coalition contends that predictive policing, which uses data and algorithms to identify and profile individuals or locations for predicting criminal acts, should be banned. They highlight the need for transparency, accountability, and legislative regulation for all other AI uses in policing.
Sara Chitseko, Pre-crime Programme Manager for the Open Rights Group, emphasized the dangers of AI and automated systems in policing, stating: "AI and automated systems have been proven to magnify discrimination and inequality in policing. Of particular concern are so-called 'predictive policing' and biometric surveillance systems which are disproportionately used to target racialised, working class and migrant communities."
The letter specifically addresses the contentious use of facial recognition technology by police, which has faced sustained opposition from civil liberties groups. The coalition's demand echoes the House of Lords Justice and Home Affairs Committee, which has called for a clear legal framework governing the technology.
The adoption of facial recognition technology by UK police has expanded significantly in recent years, with several forces moving toward permanent deployments. Despite this expansion, concerns persist about potential biases and the proportionality of deploying the technology so widely.
The UK Home Office has allocated £55.5 million to enhance police capabilities using facial recognition, particularly targeting retail crimes like shoplifting. This funding supports the Retail Crime Action Plan, which leverages CCTV footage to identify offenders through the Police National Database.
FAQs:
"Q: What is the primary concern of the #SafetyNotSurveillance coalition?
A: The coalition is concerned about the discrimination and inequality perpetuated by AI systems in policing, particularly against racialized, working-class, and migrant communities.
Q: What is predictive policing?
A: Predictive policing uses data and algorithms to identify and profile individuals or locations for predicting criminal acts.
Q: How much funding has the UK Home Office allocated for facial recognition technology?
A: The UK Home Office has allocated £55.5 million to enhance police capabilities using facial recognition technology.
Q: What is the Retail Crime Action Plan?
A: The Retail Crime Action Plan is an initiative that leverages CCTV footage to identify offenders through the Police National Database, particularly targeting retail crimes like shoplifting.
Q: Which organizations are part of the #SafetyNotSurveillance coalition?
A: The coalition comprises 17 civil society organizations, including the Open Rights Group, Big Brother Watch, Liberty, and the Network for Police Monitoring.