Constitutional concerns over facial-recognition security
The use of facial-recognition technology has “far-reaching and alarming” implications for privacy and human rights, the Bermuda Human Rights Commission has warned.
The organisation was also concerned that deployment of the artificial-intelligence technology could be incompatible with the Constitution and other international agreements that the island has signed up to.
Facial recognition is among a suite of upgrades in a new security camera network being installed across the island.
However, the Government said last night that unspecified “practical challenges” meant the feature would not be put into use at present.
The HRC said this week that it was continuing to examine the full implications of the technology generally as well as its use in Bermuda.
Lisa Reed, its executive officer, added: “The Office of the United Nations High Commissioner for Human Rights has commented on the need to take urgent action to assess and address the serious risks that facial-recognition technology poses to human rights.
“The implementation of facial-recognition technology has far-reaching and alarming implications for human rights and privacy, and may be deemed incompatible with the Constitution and international covenants that Bermuda is a signatory to.”
Ms Reed said the technology raised “significant human rights concerns”, including the right to private life and non-discrimination, “especially in circumstances where the legal and policy framework for the use of facial-recognition technology is inadequate”.
Privacy concerns about security camera systems in Bermuda go back more than 25 years.
In 1997, when the Corporation of Hamilton announced moves for a $1.3 million closed-circuit television system, lawyer Tim Marshall protested that it would “trample upon the privacy rights of law-abiding citizens”.
Last month, the Free Democratic Movement said the island’s new network had led to privacy concerns in the Loyal Hill area of Devonshire, and claimed its use could infringe on personal freedoms.
That prompted reassurances from Michael Weeks, the Minister of National Security, as well as the Commissioner of Police.
Ms Reed — in comments made before the Government sent its response last night — said the HRC had “not yet” discussed facial-recognition technology with the Government, and could not speak to “what safeguards, controls or frameworks are being put in place”.
She added: “We encourage that consideration be given to the compatibility between facial recognition technology and constitutional obligations, international obligations, and relevant privacy and data protection laws.
“Direct engagement with the independent offices on the island, such as the Privacy Commissioner’s Office and the Information Commissioner’s Office, may be of assistance.”
The Government said last month that police training in the system included “mastering AI technologies such as facial recognition, object tracking, and anomaly detection”.
Mr Weeks and Darrin Simons, the Commissioner of Police, also sought to ease the public’s privacy concerns over the CCTV network.
The national security minister insisted that “no rights to privacy will be compromised”, while Mr Simons said there were “robust protocols” governing the use of cameras.
Mr Weeks told the House of Assembly last July that the 247 new cameras in the island-wide CCTV network included facial and licence plate recognition, among other features.
Last night a spokeswoman for the national security ministry said that while an earlier announcement referenced training on facial recognition, there were no plans to use the function “at this time”.
She added: “Facial recognition has practical challenges that make its implementation impractical in our current system.
“The ministry's primary focus is balancing public safety with the necessary expectation of privacy.
“Additionally, prior to the full commissioning of the new CCTV system, meetings will be held with the Privacy Commissioner to ensure privacy protections for Bermudians while the Government provides the police with additional tools to investigate crimes.”
The Royal Gazette sent queries to the ministry on April 29 on the type of software and its accuracy rate, and whether steps were taken to ensure problems in identification did not surface on the island.
The questions were raised in the wake of reports overseas faulting the technology after campaigners noted racially biased mistakes.
In particular, an error rate of up to 35 per cent for recognising Black females was cited by organisations such as the American Civil Liberties Union.
Police cameras have been in use in Bermuda for decades, with an upgrade in 2014 that included licence-plate reading.
The new CCTV system, expected to be in place this summer after delays caused by persistent bad weather, comes with “improved capabilities in tracking, verifying, and recognising individuals and vehicles through sophisticated imaging technology”, police said last month.
Specialised training courses were held at police headquarters in Prospect on the network’s video and analytics system.
A 2018 report by the Massachusetts Institute of Technology cited an error rate by AI of 0.8 per cent for light-skinned men, compared with 34.7 per cent for darker-skinned women.
Gideon Christian, an assistant professor at the faculty of law for the University of Calgary, said in an August 2023 report that there was a “false notion that technology, unlike humans, is not biased”.
He added: “In some facial recognition technology, there is over 99 per cent accuracy rate in recognising White male faces. But, unfortunately, when it comes to recognising faces of colour, especially the faces of Black women, the technology seems to manifest its highest error rate, which is about 35 per cent.”
The department store Macy’s was reportedly hit with a lawsuit after facial recognition was blamed for an alleged wrongful arrest for robbery in October 2023.
Last August, a Black woman in Detroit who was eight months pregnant was arrested for car theft and robbery. She later sued the police over allegations of mistaken identity.
The journal Scientific American warned in May 2023 that facial recognition software was prone to mistakes with Black features, leading to false arrests.
The algorithms running the technology were said to be skewed towards White faces, with the magazine recommending a diversity of images fed into the system.
The report added: “The time for blind acceptance of this technology has passed. Software companies and law enforcement must take immediate steps towards reducing the harms of this technology.”