Edmonton Police Trial AI-Driven Body Cameras with Facial Recognition

In Edmonton, Canada, a city with over 1 million residents, a new pilot project is testing the integration of artificial intelligence into police body cameras. The cameras are designed to detect faces and match them against a “high-risk” watch list of approximately 7,000 people. The initiative is the first of its kind in North America and raises critical questions about the future of facial recognition in law enforcement.

This trial, launched just last week, marks a reversal for Axon Enterprise, Inc., the leading manufacturer of body cameras. In 2019, citing ethical concerns surrounding facial recognition, Axon pledged to hold the technology back from its products. Despite those past reservations, the company’s new project in Edmonton is stirring unease both locally and across the continent.

Barry Friedman, the former chair of Axon’s AI ethics board and now a law professor at New York University, has voiced concerns over the lack of public discourse and scientific evaluation before moving forward. “The risks and costs of these technologies are significant,” said Friedman. “Without clear benefits, they should not be deployed.”

Rick Smith, the CEO and founder of Axon, however, sees the Edmonton project as a crucial step in "early-stage field research" that will provide valuable insights into the performance of facial recognition and help establish the necessary safeguards for its use in policing. Smith emphasized that this test, conducted outside the U.S., would enable Axon to gather unbiased data and refine its oversight processes before expanding the technology's use.

The primary goal of this initiative is to improve the safety of Edmonton's police officers. By enabling body cameras to identify individuals on the watch list—people flagged as violent, armed and dangerous, escape risks, or suspects in high-risk crimes—the technology aims to provide real-time alerts to officers. Kurt Martin, acting superintendent of the Edmonton Police Service, confirmed that the watch list currently includes 6,341 individuals, plus a further 724 with active criminal warrants.

The project focuses on ensuring that only individuals with serious offenses are targeted. “We want to make sure this is as precise as possible, focusing on those with the most serious criminal backgrounds,” said Ann-Li Cooke, Axon’s director of responsible AI.

The implications of this pilot are significant not only for Edmonton but for policing worldwide. Axon is a major supplier of body cameras to law enforcement, and the company has increasingly expanded its reach to agencies in Canada and abroad. In a notable victory, Axon secured a contract to provide body cameras for the Royal Canadian Mounted Police, surpassing its closest competitor, Motorola Solutions.

While Motorola also has the capability to integrate facial recognition into its body cameras, the company has opted to refrain from doing so, citing ethical concerns. "Our decision is rooted in our principles. We have not deployed facial recognition for proactive identification, though we cannot rule out using it in the future," said Motorola in a statement.

In 2023, the Alberta government mandated the use of body cameras across all provincial police agencies, including Edmonton’s force. This move is part of a broader initiative aimed at increasing transparency, improving evidence collection, and expediting the resolution of investigations and complaints.

However, the prospect of using real-time facial recognition has sparked considerable opposition. Civil liberties groups and advocates for racial justice have raised concerns about the accuracy and potential bias of the technology. Studies have shown that facial recognition systems can deliver flawed results based on race, age, and gender, and are less reliable when analyzing live video feeds compared to mug shots or ID photos.

Several U.S. states and cities have already sought to limit police use of facial recognition technology, and the European Union has banned its use for real-time public face-scanning except in cases of serious crimes like terrorism or kidnapping. The United Kingdom, by contrast, has been testing the technology since 2013, and authorities there have made over 1,300 arrests using it in the past two years. The British government is now considering a nationwide roll-out.

Despite the controversies, the Edmonton pilot proceeds with cautious optimism. Axon has not disclosed the third-party provider of its facial recognition AI but has stated that the test will continue until the end of December, during daylight hours only. During the trial, the roughly 50 participating officers will not be alerted in real time when the software matches a face; matches will instead be reviewed afterward. Eventually, however, the system could help officers detect dangerous individuals nearby, allowing them to call for backup if necessary. Martin assured that this would only occur during active investigations or responses to calls, not during routine patrols.

“We are committed to balancing security and privacy,” said Martin, explaining that officers will have control over when the cameras are switched to active recording mode with higher resolution.

The Office of the Information and Privacy Commissioner of Alberta is reviewing the privacy impact assessment for the project, a step required for projects that handle sensitive personal data.

Criminology professor Temitope Oriola of the University of Alberta believes Edmonton’s pilot is a natural testing ground for such technologies, given their growing ubiquity in sectors like airport security. “Edmonton is a laboratory for this tool,” said Oriola. “It might turn out to be an improvement, but we cannot be sure yet.”

The Edmonton Police Service's relationship with its Indigenous and Black communities, especially following the fatal police shooting of a South Sudanese community member, remains a sensitive issue. Whether facial recognition will improve public safety or further strain community relations is yet to be seen.

Ethical controversy is nothing new for Axon. In 2022, Friedman and other members of Axon’s AI ethics board resigned over the company’s plans to deploy a Taser-equipped drone. Since then, the company has continued its controlled research into facial recognition, asserting that the technology has become more accurate and ready for real-world trials.

However, concerns about accuracy remain. Axon acknowledged that factors such as lighting, distance, and angle could skew facial recognition results, especially for individuals with darker skin tones. Every match, Axon states, will require a human review, and the testing process is designed to fine-tune how human reviewers can mitigate these issues.

Friedman remains critical, stressing the need for transparency and independent oversight. “A pilot is a good idea,” he said. “But there must be accountability. The public and experts must be involved in these decisions, not just vendors and police agencies.”