I Opted Out of Facial Recognition at the Airport—It Wasn’t Easy

The announcement came as we began to board. Last month, I was at Detroit’s Metro Airport for a connecting flight to Southeast Asia. I listened as a Delta Air Lines staff member informed passengers that the boarding process would use facial recognition instead of passport scanners.

As a privacy-conscious person, I was uncomfortable boarding this way. I also knew I could opt out. Presumably, most of my fellow fliers did not: I didn’t hear a single announcement alerting passengers how to avoid the face scanners.

To figure out how to do so, I had to leave the boarding line, speak with a Delta representative at their information desk, get back in line, then request a passport scan when it was my turn to board. Federal agencies and airlines claim that facial recognition is an opt-out system, but my recent experience suggests they are incentivizing travelers to have their faces scanned—and disincentivizing them from sidestepping the tech—by not clearly communicating alternative options. Last year, a Delta customer service representative reported that only 2 percent of customers opt out of facial recognition. It’s easy to see why.

As I watched traveler after traveler stand in front of a facial scanner before boarding our flight, I had an eerie vision of a new privacy-invasive status quo. With our faces becoming yet another form of data to be collected, stored, and used, it seems we’re sleepwalking toward a hyper-surveilled environment, mollified by assurances that the process is undertaken in the name of security and convenience. I began to wonder: Will we only wake up once we no longer have the choice to opt out?

Until we have evidence that facial recognition is accurate and reliable—as opposed to simply convenient—travelers should avoid the technology where they can.

The facial recognition plan in US airports is built around the Customs and Border Protection Biometric Exit Program, which uses face-scanning technology to verify a traveler’s identity. CBP partners with airlines—including Delta, JetBlue, American Airlines, and others—to photograph each traveler while boarding. That image gets compared to one stored in a cloud-based photo-matching service populated with photos from visas, passports, or related immigration applications. The Biometric Exit Program is used in at least 17 airports, and a recently released Department of Homeland Security report states that CBP anticipates having the ability to scan the faces of 97 percent of commercial air passengers departing the United States by 2023.

This rapid deployment of facial recognition in airports follows a 2017 executive order in which President Trump expedited former President Obama’s efforts to use biometric technology. The Transportation Security Administration has since unveiled its own plan to improve partnership with CBP and to introduce the technology throughout the airport. The opportunity for this kind of biometric collection infrastructure to feed into a broader system of mass surveillance is staggering, as is its ability to erode privacy.

Proponents of these programs often argue that facial recognition in airports promotes security while providing convenience. But abandoning privacy should not be a prerequisite for achieving security. And in the case of technology like facial recognition, the “solution” can quickly become a deep and troubling problem of its own.

For starters, facial recognition technology appears incapable of treating all passengers equally at this stage. Research shows that it is particularly unreliable for gender and racial minorities: one study, for example, found a 99 percent accuracy rate for white men, while the error rate for women with darker skin reached up to 35 percent. This suggests that facial recognition could actually increase the likelihood that women and people of color are unfairly targeted for additional screening.
