Brief #56—Civil Rights

Policy Summary: In July 2018, the American Civil Liberties Union (ACLU) published a blog post about the results of an experiment it conducted using Amazon's facial recognition software, known as "Rekognition." The ACLU purchased the software directly from Amazon, just as an ordinary purchaser would, and then used Amazon's recommended default settings to build a face database and search tool. For its face database, the ACLU used a publicly available set of 25,000 arrest mugshots. The ACLU then scanned photos of every current Member of Congress against that database. Twenty-eight Members of Congress were falsely identified as having previously committed a crime. LEARN MORE

Analysis: The results of the ACLU's bold experiment appear to have set the stage for a battle over the future of facial recognition software. Amazon has been engaged in the testing, marketing, and sale of its facial recognition software since late 2016, and since that time Congress had been reluctant to get involved in the policy discussion regarding the use of the technology. In response to the test conducted by the ACLU, Amazon released a statement that said, "The ACLU continues to distort the facts to suit [its] purposes." The statement went on to list the successes the tool has had in helping to find missing children, fight human trafficking, and fight financial fraud, as well as a number of other contributions to public safety. On July 27, 2018, twenty-five Members of Congress co-signed a letter to Amazon CEO Jeff Bezos asking him to address concerns about the software's impact on communities of color. Congress finally waded into the issue of facial recognition technology, albeit belatedly and only after being prodded by the ACLU's test, but it was Amazon's response that caught everyone's attention.

In a blog post, Amazon stated, "It is a very reasonable idea, however, for the government to weigh in and specify what temperature (or confidence levels) it wants law enforcement agencies to meet to assist in their public safety work." This statement referred to the recommended confidence level setting for the software, because that setting determines the rate of error. A higher setting should logically produce a lower error rate, while a lower setting increases the chance of false positive matches, as when the ACLU's test (conducted at a lower setting) falsely identified Members of Congress as having committed a crime. This is significant because Amazon was now stating that the government and law enforcement agencies should determine the standards for proper use of the software. Yet this contradicted Amazon's own prior conduct: when marketing the software to law enforcement agencies, Amazon itself had been recommending the proper setting to prevent inaccurate face matches, and it was forced to backtrack only after Members of Congress were falsely identified. The contradiction between Amazon's actions and its statement illustrates that the next stage in the future of facial recognition technology will be deciding what standards will be implemented to prevent abuse and who will ultimately set those standards: the government or the tech companies themselves. The question is far from settled and could have far-reaching consequences. LEARN MORE, LEARN MORE
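The role the confidence threshold plays in the dispute above can be sketched in a few lines of code. This is a hypothetical illustration, not Amazon's API: the match scores are invented, and the 80% and 99% figures reflect the default setting the ACLU reportedly used and the stricter setting Amazon has recommended for law enforcement.

```python
# Illustrative sketch of how a confidence threshold shapes face-match
# results. Candidate scores here are hypothetical; a real system returns
# a similarity/confidence percentage for each candidate match.

def filter_matches(candidates, threshold):
    """Keep only candidate matches at or above the confidence threshold."""
    return [name for name, confidence in candidates if confidence >= threshold]

# Hypothetical candidate matches (mugshot ID, confidence %) for one probe photo.
candidates = [
    ("mugshot_1041", 99.2),  # strong match
    ("mugshot_2207", 87.5),  # borderline -- a potential false positive
    ("mugshot_0393", 81.3),  # borderline -- a potential false positive
]

# At a permissive 80% threshold, all three candidates pass.
print(filter_matches(candidates, 80.0))

# At a strict 99% threshold, only the strongest candidate survives.
print(filter_matches(candidates, 99.0))
```

The sketch shows why the choice of threshold, and who gets to choose it, matters: the same software run over the same photos returns three "criminal matches" at one setting and one at another.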

Engagement Resources:

This brief was compiled by Rod Maggay. If you have comments or want to add the name of your organization to this brief, please contact
