The Calgary Police Service has confirmed two of its officers tested controversial facial-recognition software made by Clearview AI.
While the police service doesn’t use Clearview AI in any capacity, it said two of its members had tested the technology to see if it was worthwhile for potential investigative use.
“Neither officer used the software in any active investigations and both ceased use following the testing,” said a police representative. “Both have been told to delete any active user accounts.”
Calgary police said one of the officers currently works with the service and the other is seconded to another agency.
Last month, it was revealed that some Canadian law enforcement agencies were using Clearview AI software. The program uses a database of billions of open-source images collected from popular social media platforms such as Facebook and Twitter, which authorities can then use to identify perpetrators and victims of crime.
On Wednesday, Clearview AI revealed its client list had been hacked. It came to light that more than 2,200 law enforcement agencies, companies and individuals are using the software, including the Toronto Police Service and divisions of the RCMP.
Both the Calgary Police Service and the Edmonton Police Service denied using the software earlier this month, but each has since acknowledged that several of its officers had tested Clearview AI.
Staff Sgt. Gordon MacDonald, of the Calgary police criminal identification section, said the service wouldn’t be interested in software that uses open-source images due to ethical concerns.
“As an organization, we wouldn’t be interested in it no matter the benefits it purports to bring,” said MacDonald.
“It’s just so fundamentally and ethically unsafe to start using that as a means to obtain some form of identification. It’s far better to go through our own photographs that we’ve obtained and can verify who these people are.”
Bonita Croft, chair of the Calgary police commission, said the Calgary Police Service has clear policies guiding the use of information technology and monitors compliance with those policies and privacy laws.
“We understand that CPS is evaluating the situation to determine whether the privacy commissioner needs to be notified,” said Croft. “The guidance of the privacy commissioner has been instrumental in how the CPS uses tools like body-worn cameras and facial recognition technology.”
In Edmonton, Clearview AI facial-recognition programs were used without approval at least twice by that city’s police service, which triggered an investigation by Alberta’s privacy commissioner, Jill Clayton.
She said in a statement that the situation serves as a “wake-up call to law enforcement in Alberta that building trust is critical to advancing the use of new technologies for data-driven policing.”
Three officers used the technology in Edmonton, according to Supt. Warren Driechel. All members have been directed not to use Clearview AI software moving forward.
Calgary police were the first Canadian police force to use facial-recognition technology. Since 2014, the service has used biometric software created by NEC Corp. of America.
Using the technology, police compare photos and videos, such as CCTV images of persons of interest, with their mug shot database of more than 350,000 images taken under the Identification of Criminals Act.
With files from Postmedia Edmonton
Twitter: @alanna_smithh