It’s time for Amazon to head back to the drawing board.

The mega-company headed by the world’s richest man, Jeff Bezos, has been busy developing facial recognition software that could be used for surveillance, drones, and other creepy things. Turns out, it doesn’t work very well. As Engadget reports, the American Civil Liberties Union put the software, named Rekognition, to the test and came up with some odd results.

When the ACLU compared photos of "every current member of the House and Senate" against a database of 25,000 public mugshots, Rekognition falsely identified 28 lawmakers as criminals.

Yes, hehe, the government is full of cronies who have probably committed crimes in either a legal or a moral sense, but seriously, this is not good. Facial recognition software often falls short, particularly for people of color (in part because it’s mostly developed by white people). Remember the iPhone X’s reported trouble telling some Asian users’ faces apart? When this technology is put to use catching criminals, it could be damning for the innocent.

Unsurprisingly, six of the false matches were members of the Congressional Black Caucus, including civil rights activist and representative John Lewis of Georgia. "Nearly 40 percent of Rekognition's false matches were of people of color, even though they make up only 20 percent of Congress," according to the ACLU.

Herein lies the problem with handing this kind of technology to police: it only reinforces the racial bias behind the police brutality that kills so many innocent black and brown people today. Amazon has nonetheless pitched Rekognition to law enforcement for use in stationary cameras and body cams, despite investors' pleas not to.

Amazon also refused to accept the results of the ACLU’s test. According to Engadget, the ACLU used the default 80 percent confidence threshold for its experiment, which Bezos and Co. claimed wasn't appropriate. "While 80 percent confidence is an acceptable threshold for photos of hot dogs, chairs, animals, and other social media use cases, it wouldn't be appropriate for identifying individuals with a reasonable level of certainty," the company told The Verge.

Amazon recommends using a non-default 95 percent confidence threshold for identifying individuals, but there’s no guarantee law enforcement would use it, or that it would really make that much of a difference. If Amazon leaves things as is, this could be detrimental to communities of color and more than a little problematic for our legal system.
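To make the threshold dispute concrete, here is a minimal sketch in Python. The names and similarity scores are made up for illustration, not real Rekognition output, but it shows the mechanic both sides are arguing about: the same list of candidate matches shrinks as the confidence threshold rises from Amazon's 80 percent default to its recommended 95 percent.

```python
# Hypothetical candidate matches with made-up similarity scores (0-100).
# These are illustrative values, not actual Rekognition results.
matches = [
    {"name": "Person A", "similarity": 82.1},  # weak match: clears 80, not 95
    {"name": "Person B", "similarity": 96.4},  # strong match
    {"name": "Person C", "similarity": 88.7},  # weak match: clears 80, not 95
    {"name": "Person D", "similarity": 99.2},  # strong match
]

def filter_matches(candidates, threshold):
    """Keep only candidates at or above the given confidence threshold."""
    return [m for m in candidates if m["similarity"] >= threshold]

# At the 80 percent default, every candidate counts as a "match".
print(len(filter_matches(matches, 80)))  # 4

# At the recommended 95 percent, only the strongest matches survive.
print(len(filter_matches(matches, 95)))  # 2
```

The weak matches that slip through at 80 percent are exactly the kind of false positives the ACLU's test surfaced; raising the threshold cuts them, but only if the agency running the software actually changes the setting.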

"Our test reinforces that face surveillance is not safe for government use," Jacob Snow, the ACLU's technology and civil liberties attorney for Northern California, told Engadget. "Face surveillance will be used to power discriminatory surveillance and policing that targets communities of color, immigrants and activists."