Facial Recognition Software and the Rise of Digital Police Enforcement

The San Diego Police Department embraces the tech world.

The San Diego Police Department is embracing the tech world. But will its use of facial recognition software cause more harm than good?


One of the side effects of keeping a digital log of our lives is the compression of time it seems to trigger. On Facebook you can never cease being the awkward figure you were in 2006, unaware the world had a jury bigger than your high school’s sophomore class waiting. In the years since, whatever identity you have made for yourself can quickly seem like a performance, a sophisticated covering up of the ugly and unpresentable self you’d likely rather see interred than kept alive in digital limbo.

Over the last year, the San Diego Police Department has been using a technology that will add legal consequence to this game of metaphysical ennui. Officers have had a facial recognition app on both Android phones and tablets that allows them to take a picture of any person they suspect of criminal behavior and upload it to servers, where it will be cross-checked against an archive of booking photos of past offenders.


The idea of facial recognition software has slowly crept into all kinds of consumer technology over the last few years. Android phones have supported facial recognition in lieu of security codes for years, and both PlayStation 4 and Xbox One will be able to log players in with facial recognition features. A UK company has been testing facial recognition software at a number of expensive shops and hotels in order to improve service for celebrities and rich shoppers who’ll be recognized and pampered as soon as they walk in the door. Earlier this year a Department of Homeland Security project with the ignominious codename BOSS (Biometric Optical Surveillance System) was uncovered, which would tentatively scan through crowds to search for wanted criminals.

These uses—always entertainment-driven or distantly experimental—made it easy to think of facial recognition as a toy, a weird pleasure, like the first time you saw yourself tagged in someone else’s picture on Facebook. Like all new technology, it offers a simple pleasure in the first encounter: a new and not entirely clear phenomenon emerges, and it’s enough to just marvel at how nice having a new thing can be.

But not everyone benefits from having new things in equal measure. The San Diego police’s uses for facial recognition only seem to be deepening the alienation of those drawn into the American legal system, ensuring they will never be able to build new identities separate from all their past selves. The use of FaceFirst’s tool is authorized during any “traditional police-civilian encounters,” a sufficiently vague description that ensures individual prejudice will be able to steer the process.

In documents reported by The Center for Investigative Reporting, one officer had been interviewing the neighbor of a suspect when he felt his “spidy senses” engage. Though the neighbor was being cooperative and had done nothing wrong, the officer decided to roll the dice and take the man’s picture and run him through the system. In about eight seconds the system came back with a 99.6 percent match for an illegal immigrant with a DUI conviction from 10 years earlier. In its first 10 months of use in San Diego, the tool has been used for 5,629 similar queries, where a police officer sees someone they don’t like and turns to the oracular computer to tell them why they were probably right to have that reaction. 


These haphazard incursions into a person’s private life based on a prejudicial hunch by someone with a gun or some other structural form of power are not new. Last month, Barneys had a 21-year-old man jailed for two hours after he bought a $350 belt. The man was black, and one of the department store’s employees called the police after the purchase, suspecting he had used a fraudulent debit card to complete the transaction. There was no evidence of wrongdoing, no inappropriate behavior, and after two hours the man was freed and the suspicions proven baseless. But before that happened he was stopped, interrogated, and imprisoned by people with guns acting on a superstitious gut reaction.

The social structure that normalizes wildly invasive excesses like these has a new tool that allows access to a vast network of circumstantial evidence to justify prejudices in reverse, stopping people not because they’re doing anything wrong but because of a hunch that the person they don’t like has some as yet unpunished secret past that must have its public reckoning. Enabling the government to create an imperishable digital catalog of people it has punished, or would like to, is another way of shackling a person to the worst interpretations of them held by other people.

The most unacceptable element of facial recognition software is the appearance of impartiality the machines give to a process of police harassment and violence. “Photographs are neutral—you can’t say it’s racist when a camera is taking a neutral picture of someone,” one officer said in an interview. The hell of this system doesn’t come from the camera, but from the human who takes the information from it as justification for cuffing and caging anyone with the wrong digital history.

Since there is no reason to believe the humans operating these neutral tools intend to use them neutrally, there can be no ultimate good served by arming them with facial recognition. If they can’t do their jobs with guns and nightsticks, they’ll probably never be able to.

Michael Thomsen is Complex's tech columnist. He has written for Slate, The Atlantic, The New Inquiry, n+1, Billboard, and is author of Levitate the Primate: Handjobs, Internet Dating, and Other Issues for Men. He tweets often at @mike_thomsen.
