A civil liberties association accuses the system of being unreliable. As proof, it reports that the software identified 28 people as criminals … while they were members of the American Congress. Amazon defends itself by questioning the test.
Amazon is not content with being a giant of e-commerce and cloud hosting. It has also developed an intelligent image-analysis service open to all, Rekognition, used in particular by police departments and US government agencies to identify and track people filmed by CCTV cameras. The facial recognition function of this system is now the subject of intense controversy in the United States.
Politicians of Color Over-Represented
The software is accused by the American Civil Liberties Union (ACLU) of being unreliable. The organization ran photos of all members of Congress through Rekognition, comparing them against a database of criminal mugshots. And surprise: the software wrongly established 28 matches. The misidentified US representatives were men and women, Republicans and Democrats, of all ages and from all regions of the United States. People of color, however, who represent only 20% of the members of Congress, account for 40% of the false matches.
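The disparity the ACLU points to boils down to a simple ratio, which can be sketched as follows. Only the 20% and 40% figures come from the article; the variable names are illustrative:

```python
# Sketch of the over-representation arithmetic behind the ACLU's claim.
# Figures reported in the article; everything else is illustrative.
share_of_color_in_congress = 0.20   # members of color as a share of Congress
share_of_color_in_matches = 0.40    # members of color among the 28 false matches

# How many times more often people of color appear among false matches
# than their share of Congress would predict.
over_representation = share_of_color_in_matches / share_of_color_in_congress
print(over_representation)  # 2.0: twice their share of Congress
```

In other words, members of color were falsely matched at twice the rate their numbers alone would suggest, which is the basis of the ACLU's bias claim.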
For the ACLU, the conclusion is obvious: the software recognizes people of color less well than white people and is therefore biased. Given the serious consequences for the lives of people who could be falsely suspected in this way, the organization has asked for a moratorium on the use of facial recognition by law enforcement. It cites as an example a case in Oregon where Amazon Rekognition is already being used to compare people's faces against a database of police mugshots, without any prior public debate.
“This technology should not be used before the harms are fully taken into account and all necessary measures are taken to prevent harm to vulnerable communities,” reads the official website of the association.
Amazon’s response was quick. The company first criticizes the ACLU for not disclosing the parameters of its experiment. In particular, it argues that the matches the ACLU highlighted are merely false positives obtained with the wrong settings: the confidence threshold used was only 80%. That is acceptable for recognizing a celebrity or a family member in a photo app, but Amazon recommends a 99% confidence threshold when identifying someone in sensitive situations. By reproducing the ACLU test with this 99% threshold, Amazon claims to have obtained no false matches. But it offers no evidence of this counter-investigation.
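Amazon's threshold argument can be sketched in a few lines. The similarity scores and subject names below are invented for illustration; only the 80% and 99% figures come from the exchange between Amazon and the ACLU:

```python
# Sketch: how raising a confidence threshold filters out weak face matches.
# Scores are invented; 80% is the default the ACLU reportedly used,
# 99% is the threshold Amazon recommends for law-enforcement use.
candidate_matches = [
    {"subject": "member_A", "similarity": 82.5},
    {"subject": "member_B", "similarity": 91.0},
    {"subject": "member_C", "similarity": 99.4},
]

def matches_above(threshold, matches):
    """Keep only matches whose similarity score meets the threshold."""
    return [m for m in matches if m["similarity"] >= threshold]

print(len(matches_above(80.0, candidate_matches)))  # 3 matches survive at 80%
print(len(matches_above(99.0, candidate_matches)))  # only 1 survives at 99%
```

The dispute is thus not about the matching engine itself but about where the cut-off is set: a lower threshold returns more candidates, and with them more false positives.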
In addition, the group stresses that its software is only a decision aid and in no way definitive proof: it is always up to a human to make the final call. The company finally regrets not knowing more about the comparison database used by the ACLU, which could itself be skewed and have caused these incorrect results. Amazon concludes by recalling that it is constantly improving its system and that machines remain the best way to identify a person, eyewitness testimony being less reliable than software.
Amazon's arguments call for a first objection: for the moment, nothing obliges a user to push the confidence threshold that high. This is perhaps what the company alludes to in the enigmatic passage at the end of its statement: “We should not throw away the oven because the temperature could be set wrong and burn the pizza. However, it would be reasonable for the government to intervene and specify the temperature (or confidence levels) that law enforcement agencies must adhere to in order to carry out their security work.” Amazon calling for government intervention is not common.
White Men Are Better Recognized
But this does not address a more fundamental problem. Last February, a study by the MIT Media Lab, testing tools from Microsoft, Google, Face++ and IBM, showed that facial recognition was much more effective on white men. The reason is simple: computers have mostly been trained to recognize white men rather than women or people of color. The training databases were skewed, and today software performance is tainted as a result. A more rigorous academic study of Rekognition is therefore awaited, to check that it does not produce biased results without knowing it.
This is not the first time this software has been the subject of controversy. Last June, employees petitioned Jeff Bezos, outraged that AWS cloud services and the famous facial recognition tool could be used by the Trump administration to indirectly track down migrants. Shareholders and civil rights organizations are also calling on Amazon, for these reasons, to stop doing business with the government.