By Catie Edmondson
Aug. 6, 2018
WASHINGTON — The program makes boarding an international flight a breeze: Passengers step up to the gate, get their photo taken and proceed onto the plane. There is no paper ticket or airline app. Thanks to facial recognition technology, their face becomes their boarding pass.
“I would find it superconvenient if I could use my face at the gate,” said Jonathan Frankle, an artificial intelligence researcher at M.I.T. studying facial recognition technology. But “the concern is, what else could that data be used for?”
The problem confronting Mr. Frankle, as well as thousands of travelers, is that few companies participating in the program, called the Traveler Verification Service, give explicit guarantees that passengers’ facial recognition data will be protected.
And even though the program is run by the Department of Homeland Security, federal officials say they have placed no limits on how participating companies — mostly airlines but also cruise lines — can use or store that data, leaving travelers’ most personal information open to potential misuse and abuse, such as being sold or used to track passengers’ whereabouts.
The data the airlines collect is used to verify the identity of passengers leaving the country, an attempt by the department to better track foreigners who overstay their visas. After passengers’ faces are scanned at the gate, the scan is sent to Customs and Border Protection and linked with other personally identifying data, such as date of birth and passport and flight information.
John Wagner, the deputy executive assistant commissioner for the agency’s Office of Field Operations, said he believed that commercial carriers had “no interest in keeping or retaining” the biometric data they collect, and the airlines have said they are not doing so. But if they did, he said, “that would really be up to them.”
But, Mr. Wagner added, “there are still some discussions to be had,” and federal officials are considering whether they should write in protections.
Privacy advocates have criticized the agency for allowing airlines to act as unregulated arbiters of the data.
“C.B.P. is a federal agency. It has a responsibility to protect Americans’ data, and by encouraging airlines to collect this data, instead they are essentially abdicating their own responsibility,” said Jennifer Lynch, a senior staff attorney with the Electronic Frontier Foundation, a digital rights nonprofit.
Harrison Rudolph, an associate at Georgetown Law’s Center on Privacy and Technology, voiced similar concerns in a report he helped write in December evaluating the agency’s use of facial scans.
“Are there privacy protections in the contracts that D.H.S. has reached with the airlines?” Mr. Rudolph said in an interview. “Do they require the disposal of any data collected? Do they require audits? Are there use limitations to ensure that travelers’ photos aren’t used in ways they don’t expect? Without any enforceable rules, it’s too easy for D.H.S. to break those promises.”
Mr. Wagner, however, defended the program and said it “builds upon the processes that have taken place for many years.”
“Airlines are already collecting a lot of information from a traveler and providing that to C.B.P.: the reservation data, the manifest,” he said.
But biometric data, including scans of passengers’ faces and fingerprints, is among the most sensitive, according to privacy experts, because unlike other means of identification such as a Social Security number, it cannot be changed.
The face is a particularly sensitive identifier because “if someone has a camera, they can identify you by your face,” Mr. Frankle said. “You can be recognized even if you have no idea you’re being recognized.”
The program, which currently operates through four major airlines in international airports in Los Angeles, Detroit, Orlando and Atlanta, is not mandatory for passengers. But the airlines — Delta, Lufthansa, British Airways and JetBlue — have reported that a majority of passengers participate.
It comes as facial recognition technology has become both more widespread and more closely scrutinized.
Companies such as Apple and Citibank have leveraged the technology, and still more — including casinos, music festival organizers and retailers like Walmart — have used it to track customers and shoplifters.
Amazon has recently drawn condemnation for providing facial recognition software to law enforcement agencies, whose use of the technology has caused privacy and civil liberties groups to voice concerns about overzealous surveillance.
Questions about the technology’s accuracy have also arisen. According to the Georgetown report, federal data shows that the system used by Customs and Border Protection incorrectly rejects as many as one in 25 travelers using valid credentials. It also cited studies that showed that facial recognition algorithms fail more frequently in correctly identifying women and people of color. The department said it does not track how many people using forged credentials manage to get through the system.
A privacy report by Customs and Border Protection published in June 2017 when the program began said federal officials would conduct a privacy evaluation within one year to ensure that airlines were complying with “required privacy protections.” An agency spokeswoman, Jennifer Gabris, said last month that the evaluation had not been completed because the program had not achieved “interim operating capability.”
Ms. Gabris did not respond to inquiries asking what privacy protections are required of participating companies; representatives for participating airlines also declined to respond to similar inquiries.
The companies have since been left to come up with their own policies that contain varying degrees of privacy assurances. In statements, the airlines stressed that the program was being tested in limited trials.
But while they provide passengers with generalized privacy policies, only one, JetBlue, specifically outlines how it will protect customers’ biometric data, an update made after inquiries from The New York Times.
Further complicating the situation is the fact that the companies share passengers’ data with the technology vendors they have contracted to create the infrastructure that collects the information and sends it to federal officials. Those vendors — Vision-Box, SITA and NEC Corporation — each have their own privacy policies with differing levels of accessibility.
“This is not something that is hard to get right if your hands are clean, which only invites suspicion if you don’t get it right,” Mr. Frankle said. “This isn’t a place where you’re innocent until proven guilty. We should be skeptical until they assure us they’re not going to use this data for anything other than convenience in ways that are legally binding.”
Some airlines, like JetBlue, have already expressed interest in using facial recognition technology to customize passengers’ experiences, such as allowing gate agents to recognize customers and even, one day, determining travelers’ moods.
Despite the privacy concerns, the Traveler Verification Service shows signs of expanding.
As part of a package devised to secure the nation’s borders, President Trump signed an executive order last year telling federal officials to speed up completion of a larger biometric entry-exit system that encompasses the program. And Congress authorized $1 billion in 2016 toward the measure.
The airlines say they hope to introduce the program within the next year to at least another half-dozen airports, and federal officials updated privacy documents last month to allow cruise lines to participate in the program.
Two senators, Edward J. Markey, Democrat of Massachusetts, and Mike Lee, Republican of Utah, have urged the Department of Homeland Security to stop the expansion of the larger effort until their privacy concerns are addressed.
In a letter sent to the agency in December, the senators said that the system “appears not to have the proper safeguards to prevent the spread of this data to third parties or other government agencies.”