AWS Bans Cops From Using Rekognition For One Year

The cloud leader has changed course on providing facial recognition technology to law enforcement after last week’s sweeping Black Lives Matter protests. AWS says it hopes sales can resume once Congress does more to regulate potential abuses of the technology.


Last February, AWS CEO Andy Jassy defended law enforcement’s use of Amazon Rekognition, its facial recognition service.

But after last week’s sweeping protests against police abuse, the cloud leader has changed course, enacting a one-year moratorium on police use of the AI-powered technology.

In a short blog post without a named author, Amazon said it would like to see laws guarding against abuse of the technology. AI-enabled facial recognition is controversial because of its potential to intrude on privacy and facilitate racial profiling.


[Related: Andy Jassy: No On AWS Spinoff; Yes On Facial Recognition Regs]

“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” the AWS blog reads. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”

Amazon added that it will allow organizations to continue using Rekognition to combat human trafficking and to reunite missing children with their families.

The shift comes a day after IBM CEO Arvind Krishna said Big Blue has stopped selling facial recognition software and opposes the use of any technology for mass surveillance, racial profiling or human rights violations.

Krishna on Monday sent a letter to Congress stating the use of facial recognition or analysis software for such purposes is "not consistent with our values and Principles of Trust and Transparency."

"We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies," Krishna said.

At its re:Invent conference in November 2016, AWS introduced Rekognition, a product that can detect objects in an image, identify people, match faces, and even perform sentiment analysis, determining whether the person in a photo is smiling or frowning.
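For readers curious how those capabilities surface to developers, they roughly correspond to the Rekognition API operations `DetectLabels` (objects in an image), `CompareFaces` (face matching) and `DetectFaces` (facial analysis, including a `Smile` attribute). The sketch below, using only the standard library, shows the shape of those requests as they would be passed to the AWS SDK; the bucket and file names are hypothetical, and an actual call would go through a boto3 `rekognition` client with AWS credentials configured.

```python
import json

# Hypothetical S3 location of the photo to analyze.
image = {"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}

# DetectLabels: identifies objects and scenes in the image.
detect_labels_request = {"Image": image, "MaxLabels": 10, "MinConfidence": 80}

# CompareFaces: matches the face in a source photo against a target photo.
compare_faces_request = {
    "SourceImage": {"S3Object": {"Bucket": "example-bucket", "Name": "reference.jpg"}},
    "TargetImage": image,
    "SimilarityThreshold": 90,
}

# DetectFaces with Attributes=["ALL"] returns facial details such as
# Smile and Emotions -- the "smiling or frowning" analysis described above.
detect_faces_request = {"Image": image, "Attributes": ["ALL"]}

print(json.dumps(detect_faces_request, indent=2))
```

With boto3, each dict would be unpacked into the corresponding client method, e.g. `client.detect_faces(**detect_faces_request)`.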

In a Frontline documentary that aired in February, Jassy defended law enforcement’s use of the cutting-edge product.

“We believe that governments and the organizations that are charged with keeping our community safe have to have access to the most sophisticated modern technology that exists,” he told the television program.

“Frontline” noted concerns that Rekognition wasn’t market-ready, and police were “essentially field testing it on the public on behalf of the company” with no clear regulations governing its use.

In response, Jassy said he welcomed government regulation.

“We’ve never had any reported misuse of law enforcement using the facial recognition technology,” Jassy said. “I think a lot of societal good is already being done with facial recognition technology. Already you've seen hundreds of missing kids reunited with their parents, and hundreds of human-trafficking victims saved, and all kinds of security and identity and education uses. At the end of the day, with any technology…the people that use the technology have to be responsible for it. And if they use it irresponsibly, they have to be held accountable.”

Jassy said the vast majority of police departments are using it according to Amazon’s prescribed guidance.

“And when they're not, we have conversations, and if we find that they're using in some irresponsible way, we won't allow them to use the service and the platform,” he said.