After the tragedy of George Floyd’s death, where do A.I. and policing go from here?

The use of artificial intelligence by law enforcement was already fiercely debated because of fears that the technology could exacerbate existing prejudices or biases against nonwhite communities. But the death of Minneapolis resident George Floyd and the ensuing nationwide protests will likely make the topic even more controversial.

Law enforcement’s use of A.I. is still fairly new, with some police departments using tools like automatic license plate readers and facial recognition software. Future A.I. tools could include more capable drones that can analyze people’s faces from the sky or powerful data analytics services that predict crime in certain areas.

Many civil rights groups worry that police departments, some of which have a history of racial discrimination, will use these technologies to unfairly target communities of color. As Michael McAfee, CEO of advocacy group PolicyLink, says, the effects of over-policing can have a huge impact, including leaving people with criminal records that limit their employment opportunities. 

“We are blessed to live in a nation that is so innovating and creates solutions that can change the quality of our life—no critique there,” McAfee says. “The critique is the lack of consciousness of the other.” 

Jeff Kunins, the chief product officer and executive vice president of software for Axon, best known for making police Tasers and body cameras, says he’s both “incredibly proud and incredibly sad” about the use of his company’s products by police during the current protests. Axon’s shares have soared 21% over the past few days because investors are betting that the business will end up selling more to law enforcement. The use of certain products like live-streaming from body cameras for command staff to watch events as they unfold is already increasing, Kunins says.

Increasingly, Axon is adding A.I. tools to its product line. That includes technology that automatically blurs people’s faces in video, which comes in handy when releasing clips that feature minors in response to public records requests.

Kunins acknowledges that the use of A.I. by law enforcement is a hot-button topic, citing the well-known adage from the Spider-Man comic book that “with great power comes great responsibility.” Axon, for its part, created an ethics board made up of people from outside the company, and said last summer that it would not sell facial-recognition technology, because the software does a poor job of identifying people of color, among other problems.

McAfee says he’s less worried about companies selling A.I. technology and body cameras; his concern is more about the institutions that buy the products. Ultimately, law enforcement agencies will be the ones using the tools.

But that doesn’t mean that vendors are absolved from any responsibility. McAfee believes manufacturers should ask themselves: “Why are my cameras not stopping this kind of abuse when they’re on?”

Kunins argues that body cameras can provide a level of accountability for police by recording potential abuses. But he also concedes that the recent protests underscore that “we have more work to do.” 

“We take as an article of faith that the technology, when used right, is part of the solution—that’s proven—and we are here to continue to make that true,” Kunins says.

Activist groups are concerned, however, that it’s ultimately up to police departments to audit themselves and release videos of possible abuses to the public. But Kunins says the mere fact that the data exists, that police departments routinely release the videos, and that they can be accessed via public records requests is better than nothing.

For Neema Singh Guliani, a senior legislative counsel for the American Civil Liberties Union, the use of A.I. and related surveillance technologies has the potential to “supercharge police abuses and racial inequalities.” She wants local communities to have more say about the kinds of technologies that law enforcement uses, which, she says, are “often deployed in secret.”

“We’ve been stuck in this place now where, with some technologies, it’s just, ‘Let’s get it out into the streets, and we’ll just worry about it later,’” Guliani says.

Kunins says that Axon is considering ways for communities to be more involved in how the company’s products are used or developed, but no final decisions about that have been made.

Other companies that activists have called out for selling A.I. tools to the police include Amazon and Clearview AI. Fortune contacted those companies and will update this story if they respond.

Alice Xiang, the head of fairness, transparency, and accountability research at the tech consortium Partnership on AI, says that supporters of using A.I. in the criminal justice system often argue that the technology could “reduce bias of human actors.” At the same time, however, the current protests underscore the “increasing distrust in some of the institutions” that would be using A.I. tools in the future.

It’s unclear if the protests will lead local governments to reevaluate the use of A.I. by law enforcement or whether events will accelerate its use.

As McAfee says, U.S. history has these “moments of conflict, and then everything reverts back.”

“It’s the routinized ceremony of black death,” he says.

But he has noticed that some in the tech industry are increasingly sounding alarms over software that can be used to discriminate against communities of color. “In the tech world, there’s an army of folks who care,” McAfee says. “You can see their voices standing up and pushing back at their organizations in ways they never have, and those folks are white.”
