Edmonton Begins Daylight Trial of Axon AI Body Cameras That Flag 'High-Risk' Faces

Edmonton police have begun a daylight-only trial of Axon body cameras that use facial recognition to flag people on a “high-risk” list of 6,341 names, plus 724 people with serious warrants. The pilot involves roughly 50 officers, runs through December, and currently logs matches for later human review at stations rather than alerting officers in the field. Critics, including the former chair of Axon’s AI ethics board, are calling for more transparency, independent testing and legislative oversight. Alberta’s privacy regulator is reviewing the project’s privacy impact assessment.

Edmonton Pilots AI-Enabled Body Cameras Amid Privacy and Bias Concerns

Edmonton police have launched a daylight-only pilot of body-worn cameras equipped with facial recognition that can flag people on the city’s “high-risk” watch lists. Axon Enterprise — the U.S. company best known for Tasers and police body cameras — says the system has been trained to detect roughly 7,000 people (6,341 on the main watch list plus 724 people with at least one serious warrant).

What the Pilot Involves

The trial, which began in early December and is scheduled to run through the end of the month, involves about 50 officers and will operate only during daylight hours to reduce the impact of low light and extreme cold on performance. In the current setup, officers will not receive immediate alerts in the field; flagged matches are to be reviewed later by humans at a station. Police say future uses could include near-real-time alerts when officers are responding to a call or actively investigating.

Official Rationale

Edmonton Acting Superintendent Kurt Martin said the pilot aims to improve officer safety by helping identify people flagged in categories such as “violent or assaultive,” “armed and dangerous,” “weapons,” “escape risk,” and “high-risk offender.”

“We really want to make sure that it’s targeted so that these are folks with serious offenses,” said Ann-Li Cooke, Axon’s director of responsible AI.

Ethics, Oversight and Public Scrutiny

Critics, including Barry Friedman — a law professor at New York University and former chair of Axon’s AI ethics board — warn that the pilot has proceeded without sufficient public debate, independent testing and transparent oversight. In 2019, while serving on that board, Friedman and his colleagues urged Axon to hold off on deploying facial recognition, citing accuracy and ethical concerns.

"It’s essential not to use these technologies, which have very real costs and risks, unless there’s some clear indication of the benefits," Friedman said.

Axon CEO Rick Smith has described the Edmonton trial as "early-stage field research" rather than a commercial launch, saying real-world testing outside the U.S. could help build oversight frameworks and inform future evaluations. Axon has also said it does not develop its own face-recognition model and declined to identify the third-party vendor supplying the technology for this test.

Accuracy, Bias and Legal Context

Researchers have documented accuracy problems with facial-recognition systems, including biased performance by race, gender and age, and reduced accuracy on live video compared with static photos. Axon acknowledged that factors such as distance, lighting and camera angle can disproportionately affect accuracy for darker-skinned people and said every algorithmic match will require human review as part of the pilot.

Regulatory responses differ internationally: the European Union has broadly banned real-time public face-scanning by police except for narrow cases involving serious crimes, while the United Kingdom has tested and used the technology in recent years. Several U.S. states and many cities have limited police use of facial recognition.

Privacy Review and Community Concerns

Alberta’s information and privacy commissioner, Diane McLeod, confirmed her office received a privacy impact assessment from Edmonton police on Dec. 2 and is reviewing it — a required step for projects handling "high sensitivity" personal data. Local academics and community leaders have noted strained relations between Edmonton police and Indigenous and Black residents, underscoring concerns that the technology could exacerbate existing inequities.

University of Alberta criminologist Temitope Oriola called Edmonton "a laboratory for this tool," saying it may prove beneficial but that evidence is not yet conclusive. Observers and ethics experts say greater transparency, independent testing and legislative oversight are needed before wider deployment.

What’s Next

The pilot will produce data and operational findings that Axon and Edmonton police say will inform whether and how the technology might be expanded. Critics want the company and the police service to publish evaluations, disclose the face-recognition vendor, and allow independent scientific review before any broader rollouts.
