
Saturday, May 11, 2019

How facial recognition became a routine policing tool in America



The technology is proliferating amid concerns that it is prone to errors and allows the government to expand surveillance without much oversight.



Police are increasingly using facial recognition to solve low-level crimes and to quickly identify people they see as suspicious. Claire Merchlinsky / for NBC News



May 11, 2019, 4:19 AM EDT
By Jon Schuppe

In August 2017, a woman contacted the Arapahoe County Sheriff’s Office in Colorado with what seemed like a simple case: After a date at a bowling alley, she’d discovered $400 missing from her purse and asked the manager to review the surveillance footage, which showed her companion snatching the cash while she bowled a frame.

But despite the clear evidence, the search for the bowling companion floundered. The woman knew only his first name. He’d removed his profile from the dating site on which they’d met. His number, now disconnected, was linked to a hard-to-trace “burner” phone. Security video captured his car in the parking lot, but not its license plate.

The investigator, Tara Young, set the case aside to work on others. It sat on a shelf until early 2018, when she ran into a colleague who was testing out the department’s new facial recognition system.

Young gave the officer a picture of the bowling companion taken from the victim’s cellphone. He plugged it into the software and up popped a mugshot of a man who looked a lot like the date thief.

It was Young’s first experience with facial recognition, one of the most powerful and controversial technological innovations of the 21st century. It gave her dormant case new life, and showed her its potential to transform policing.

Her investigation “would have been at a dead end without the facial recognition,” Young said. “It’s huge.”


A disputed tool goes mainstream

The technology-driven revolution in policing is unfolding in big cities and small communities around the country, as more police departments purchase facial recognition software. The government “facial biometrics” market — which includes federal, state and local law enforcement — is expected to soar from $136.9 million in 2018 to $375 million by 2025, according to an estimate by market research firm Grand View Research. Driven by artificial intelligence, facial recognition allows officers to submit images of people’s faces, taken in the field or lifted from photos or video, and instantaneously compare them to photos in government databases — mugshots, jail booking records, driver’s licenses.


Unlike DNA evidence, which is costly and can take a laboratory days to produce, facial recognition requires little overhead once a system is installed. The relative ease of operation allows officers to make the technology part of their daily work. Rather than reserve it for serious or high-profile cases, they are using it to solve routine crimes and to quickly identify people they see as suspicious.

But these systems are proliferating amid growing concern that facial recognition remains prone to errors — artificial-intelligence and privacy researchers have found that algorithms behind some systems incorrectly identify women and people with dark skin more frequently than white men — and allows the government to expand surveillance of the public without much oversight. While some agencies have policies on how facial recognition is used, there are few laws or regulations governing what databases the systems can tap into, who is included in those databases, the circumstances in which police can scan people’s photos, how accurate the systems are, and how much the government should share with the public about its use of the technology.

Police praise the technology’s power to improve investigations, but many agencies also try to keep their methods secret. In New York, the police department has resisted attempts by defense attorneys and privacy advocates to reveal how its facial recognition system operates. In Jacksonville, Florida, authorities have refused to share details of their facial recognition searches with a man fighting his conviction for selling $50 worth of crack. Sometimes people arrested with the help of facial recognition aren’t aware that it was used against them.

Because police don’t treat facial recognition as evidence for presentation in court, the technique does not often turn up in public documents and has not been the subject of many judicial rulings. Its use, and spread, are difficult to track.

The companies that build the technology are also grappling with the implications of its use. Amazon has given its facial recognition system to police departments to try out, sparking protests from employees, shareholders and artificial intelligence researchers. Microsoft says it has resisted requests to sell its products to police, and has called for government regulation. Axon, the largest maker of body cameras in the United States, has taken out patents for facial recognition applications but says it is not pursuing them as it consults with an artificial-intelligence ethics board.

At the same time, companies are creating even more advanced systems that will allow police to identify people from live video footage, such as body cameras, rather than just still images. It is only a matter of time before such technology is available for police to buy.





Video: Real-time facial recognition could be a revolutionary policing tool. It's also terrifying. (July 31, 2018)


These developments have triggered attempts to curb police use of facial recognition. The cities of Somerville, Massachusetts, and San Francisco and Oakland, California, are considering banning it. So is the state of Massachusetts.


Civil rights advocates, privacy researchers and criminal defense lawyers warn that the ubiquity of police-run facial recognition systems could cause officers to become overly reliant on a flawed technology and risk wrongful convictions. They say it could trigger an explosion in arrests for petty crimes, exacerbate the criminal justice system’s disproportionate impact on the poor and minorities, and lead to even more routine uses, already deployed in China, such as exposing jaywalkers and people who take toilet paper from public restrooms.

“It could have a panopticon effect where you’re worried that the government is always watching out,” said Jake Laperruque, senior counsel at the Project on Government Oversight, a nonprofit that investigates the federal government.


‘We try to use it as much as we can’

As these debates unfold, police are continuing to adopt facial recognition, a trend that began two decades ago, when Pinellas County, Florida, won a series of federal grants that allowed it to become a testing ground for the emerging technology. Sheriff Bob Gualtieri said the technology has changed policing almost entirely for the better, allowing his investigators to identify bank robbers, missing persons, even people who’ve been killed in car crashes. “We solve crimes we otherwise wouldn’t have solved,” Gualtieri said.


The technology has since crept across the country, to Los Angeles, San Diego, Chicago and New York, as well as hundreds of state and local law enforcement agencies.


Because there is no easy way to measure how many police departments have adopted facial recognition, any effort to do so provides only a glimpse. The most comprehensive assessment was conducted by the Georgetown Center on Privacy and Technology, which found in 2016 that at least one in four police agencies could run facial recognition searches, either through a system they’d purchased themselves or one owned by another agency. (For example, Pinellas County’s system, which includes millions of Florida driver’s licenses and law enforcement photos, is available to nearly 300 other law enforcement agencies.) The technology has surely spread since then.

In Colorado, local investigators have foiled credit-card fraudsters, power-tool bandits and home-garage burglars and identified suspects in a shooting and a road-rage incident. In San Diego, officers snap pictures of suspicious people in the field who refuse to identify themselves. The technology has led to the capture of a serial robber in Indiana, a rapist in Pennsylvania, a car thief in Maine, robbery suspects in South Carolina, a sock thief in New York City and shoplifters in Washington County, Oregon.

In southwestern Ohio, officers are dumping images from Crime Stoppers alerts into their newly acquired facial recognition system and solving all sorts of property crimes.

“We try to use it as much as we can,” said Jim Stroud, a police detective in Springfield Township, a Cincinnati suburb.


In Lakewood, Colorado, Det. Mark Gaffney patrols with a cellphone equipped with an app that allows him, during traffic stops and other roadside encounters, to take pictures of people whose names he can’t confirm and run them through a facial recognition system. Nearly instantaneously, the program gives a list of potential matches loaded with information that can help him confirm the identity of the people he’s stopped, and whether they have any outstanding warrants. Previously, he’d have to let the person go or bring them in to be fingerprinted.

“When I don’t know who a person is, I have more options available to me if they don’t want to tell me,” Gaffney said.


Inside one department’s operation

Few local law enforcement agencies talk openly about how they use facial recognition. Among the exceptions is the Arapahoe County Sheriff’s Office, which allowed investigators to describe how the technology has fit into their routine casework.

The agency’s use of facial recognition is typical among law enforcement, with a group of specially trained investigators acting as liaisons to the rest of the department. Every day, they sit at computer screens at the agency’s headquarters in Centennial, Colorado, plugging images into their year-old facial recognition system. The photos come from their own case files, Crime Stoppers bulletins and big-box retailers such as Home Depot and Target hit by serial shoplifters. They also take requests from detectives at other police agencies.


The amount of material is massive, particularly for property crimes, because, generally speaking, if there’s a place with something worth taking — a city street, a shop, a house — there is probably also a surveillance camera or a phone-wielding witness.


The images are uploaded into software called Lumen, sold by Numerica Corp., a defense and law enforcement contractor. An algorithm compares them to a database of mugshots and booking photos shared by law enforcement agencies across the Denver region.

Within seconds, dozens of possible matches flash onto the screen, ranked by how similar the algorithm thinks they are. The investigators scroll through the list, looking for a potential match, adding filters such as gender, race and the color of the subject’s eyes and hair. The algorithm’s confidence in the match never hits 100 percent. If the subject’s photo is grainy or captures their face from the side, or if they are wearing a hood or sunglasses, there may not be any worthwhile results. Sometimes investigators find a promising hit near the top of the rankings; other times the actual suspect turns up far down the list, ranked at perhaps 75 percent.
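The workflow the investigators describe, a probe photo scored against a gallery of booking photos and returned as a ranked candidate list that can be narrowed with metadata filters, follows a pattern common to face-search systems. The Python sketch below is a hypothetical illustration of that general pattern over precomputed face "embeddings"; the record fields, function names and scoring choices are assumptions made for illustration, not Numerica's Lumen software or the algorithm Arapahoe County uses.

import numpy as np
from dataclasses import dataclass

@dataclass
class Record:
    """One gallery entry: a booking photo reduced to an embedding, plus metadata."""
    name: str
    sex: str               # metadata an investigator might filter on
    eye_color: str
    embedding: np.ndarray  # numeric face embedding produced by some model

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity score in [-1, 1]; higher means the model thinks the faces look more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(probe: np.ndarray, gallery: list[Record],
           sex: str | None = None, eye_color: str | None = None,
           top_k: int = 50) -> list[tuple[Record, float]]:
    """Rank gallery records by similarity to the probe image's embedding,
    optionally filtered by metadata, and return the top candidates.
    The scores are investigative leads only, never a confirmed identification."""
    candidates = [r for r in gallery
                  if (sex is None or r.sex == sex)
                  and (eye_color is None or r.eye_color == eye_color)]
    scored = [(r, cosine_similarity(probe, r.embedding)) for r in candidates]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

In a real deployment the embeddings would come from a trained face-recognition model and the filters and gallery contents would be vendor-specific; as the investigators note, even the top-ranked score is only a lead for a human to verify.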

Any apparent connections then have to be verified. The investigator gets to work, running the potential match through criminal databases for more clues. The investigator might also set up a photo array, including the potential match and photos of similar looking people, to show a witness.


“The facial recognition identification is just an investigative lead. It doesn’t establish probable cause to make an arrest,” said Rick Sheets, an Arapahoe County investigator who specializes in property crimes.

Arapahoe investigators who have been trained on Lumen encourage their colleagues not to shelve cases before trying it. That is what happened in March 2018, when Investigator Jim Hills told Young about the system, which had been installed a few weeks earlier.

Young thought of the woman who’d been robbed of $400 during the date at the bowling alley seven months earlier.

“This might be a good one to try,” she said.


A mystery solved

Young gave Hills a close-up cellphone picture of the date thief. The victim, Debbie Mallick, had shared it early in the investigation, along with screenshots of text messages in which he’d admitted to stealing her $400, just before he shut down his phone.


Hills entered the snapshot into Lumen. Near the top of the results was an old booking photo of a man with dozens of prior arrests, including convictions for assault, car theft and violating restraining orders. He looked remarkably similar. His first name was Antonio, the same as the thief’s.

Young arranged for a colleague to show Mallick a photo array. Mallick picked him out immediately. Young filed an arrest warrant charging the man, Antonio Blackbear, with misdemeanor theft.

Another two months passed before Young got an alert from police in Denver, who said they’d taken Blackbear into custody. In June 2018, he pleaded guilty, was sentenced to 25 days in jail and was ordered to pay back Mallick.

Blackbear, 40, could not be reached for comment.


Mallick, 47, said she still hasn’t gotten her money back, which was all she wanted in the first place. The experience hurt her deeply, and made her less trusting of people. But the mother of two said she was grateful the investigators didn’t give up.

She had no idea how police identified Blackbear.

In arrest reports and court documents detailing the case, there is no mention of facial recognition.


