Face-recognition software: Is this the end of anonymity for all of us?
The software is already used for military surveillance,
by police to identify suspects – and on Facebook. Now the US government is in
the process of building the world's largest cache of face-recognition data,
with the goal of identifying every person in the country
By Kyle Chayka
Wednesday 23 April 2014
From 2008 to 2010, as Edward Snowden has revealed, the
National Security Agency (NSA) collaborated with the British Government
Communications Headquarters (GCHQ) to intercept the webcam footage of 1.8
million Yahoo users.
The agencies were analysing images that they downloaded
from webcams and scanning them for known terrorists who might be using the
service to communicate, matching faces from the footage to suspects with the
help of a new technology called face recognition.
The outcome was pure Kafka, with innocent people being
caught in the surveillance dragnet. In fact, in attempting to find faces,
GCHQ's Optic Nerve programme recorded webcam sex by its unknowing targets –
up to 11 per cent of the material the programme collected was "undesirable
nudity" that employees were warned not to access, according to the documents.
And that's just the beginning of what face-recognition technology might mean
for us in the digital era.
Over the past decade, face recognition has become a
fast-growing commercial industry, moving from its governmental origins into
everyday life. The technology is being pitched as an effective tool for
securely confirming identities. To some, face recognition sounds benign, even
convenient. Walk up to the international check-in at a German airport, gaze
up at a camera and walk into the country without ever needing to pull out a
passport – your image is on file, the camera knows who you are. Wander into a
retail store and be greeted with personalised product suggestions – the store's
network has a record of what you bought last time. Facebook already uses face
recognition to recommend which friends to tag in your photos.
But the technology has a dark side. The US government is
in the process of building the world's largest cache of face-recognition data,
with the goal of identifying every person in the country. The creation of such
a database would mean that anyone could be tracked wherever his or her face
appears, whether it's on a city street or in a mall.
Face-recognition systems have two components: an
algorithm and a database. The algorithm is a computer program that takes an
image of a face and deconstructs it into a series of landmarks and proportional
patterns – the distance between eye centres, for example. This process of
turning unique biological characteristics into quantifiable data is known as
biometrics.
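To make the idea concrete, here is a toy sketch of that landmark-to-data step. This is purely illustrative – the landmark names and the normalisation choice are invented for this example, and real systems such as Cognitec's use far richer features and learned models.

```python
import math

def face_print(landmarks):
    """Turn named (x, y) landmark coordinates into a vector of pairwise
    distances, normalised by the inter-eye distance so the result does not
    depend on how large the face appears in the image."""
    eye_dist = math.dist(landmarks["left_eye"], landmarks["right_eye"])
    names = sorted(landmarks)  # fixed ordering so vectors are comparable
    return [
        math.dist(landmarks[a], landmarks[b]) / eye_dist
        for i, a in enumerate(names)
        for b in names[i + 1:]
    ]

# Hypothetical landmark positions in pixel coordinates
landmarks = {
    "left_eye": (30.0, 40.0),
    "right_eye": (70.0, 40.0),
    "nose_tip": (50.0, 60.0),
    "mouth_centre": (50.0, 80.0),
}
print(face_print(landmarks))
```

Because every distance is divided by the same reference distance, the same face photographed closer to or further from the camera produces the same vector – one small example of how unique biological proportions become quantifiable data.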
Together, the facial data points create a "face-print"
that, like a fingerprint, is unique to each individual. Some faces are
described as open books; at a glance, a person can be "read". Face-recognition
technology makes that metaphor literal. "We can extrapolate enough data
from the eye and nose region, from ear to ear, to build a demographic
profile" including an individual's age range, gender and ethnicity, says
Kevin Haskins, a business-development manager at the face-recognition company
Cognitec.
Face-prints are collected into databases, and a computer
program compares a new image or piece of footage with the database for matches.
Cognitec boasts a match accuracy rate of 98.75 per cent, an increase of more
than 20 per cent in the past decade. Facebook recently achieved 97.25 per cent
accuracy after acquiring biometrics company Face.com in 2012.
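The comparison step can be sketched just as simply. The code below is a minimal illustration, not how Cognitec or Facebook actually match prints: it finds the stored print closest to a new one and accepts the match only if the distance falls under a threshold – which is also where accuracy figures like those above come from, since a looser threshold trades false rejections for false matches. All names and numbers are invented.

```python
import math

def match(probe, database, threshold=0.1):
    """Return the identity whose stored face-print is closest to the probe,
    or None if nothing in the database is within `threshold` distance."""
    best_name, best_dist = None, float("inf")
    for name, print_vec in database.items():
        d = math.dist(probe, print_vec)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Hypothetical database of enrolled face-prints
db = {"alice": [1.0, 0.5, 0.8], "bob": [0.2, 0.9, 0.4]}
print(match([1.02, 0.51, 0.79], db))  # prints "alice"
```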
So far, the technology has its limits. "The layman
thinks that face recognition is out there and can catch you anytime, anywhere,
and your identity is not anonymous anymore," says Paul Schuepp, the co-founder
of Animetrics, a decade-old face-recognition company based in New Hampshire.
"We're not that perfect yet."
The lighting and angle of face images must be strictly
controlled to create a usable face-print. "Enrolment" is the slightly
Orwellian industry term for making a print and entering an individual into a
face-recognition database. "Good enrolment means getting a really good
photograph of the frontal face, looking straight on, seeing both eyes and both
ears," Schuepp says.
How face recognition is already being used hints at just
how pervasive it could become. It's being used on military bases to control who
has access to restricted areas. In Iraq and Afghanistan it was used to check
images of detainees against al-Qa'ida wanted lists. The police department in
Seattle is already applying the technology to identify suspects on video
footage.
The technology's presence is subtle, and as it gets
integrated into devices we already use, it will be easy to overlook. The
most dystopian example might be NameTag, a start-up that launched in February
promising to embed face recognition in wearable computers such as Google Glass.
The software would allow its users to look across a crowded bar and identify
the anonymous cutie they are scoping out. The controversial company also brags
that its product can identify sex offenders on sight.
As the scale of face recognition grows, there's a chance
that it could take its place in the technological landscape as seamlessly as
the iPhone. But to allow that to happen would mean ignoring the increasing
danger that it will be misused.
By licensing their technology to everyone from military
contractors to internet start-ups, companies such as Cognitec and Animetrics
are fuelling a global biometrics industry forecast to grow to $20bn (£12bn) by
2020, according to Janice Kephart, the founder of Siba (Secure Identity and
Biometrics Association). With funding from a coalition of face-recognition
businesses, Siba launched in February 2014 to "educate about the reality
of biometrics, bridging the gap between Washington and the industry", says
Kephart, who previously worked as a legal counsel to the 9/11 Commission.
Kephart believes biometric technology could have prevented the 9/11 attacks
(which, she says, "caused a surge" in the biometrics industry) and
Snowden's NSA leaks. She emphasises the technology's protective capabilities
rather than its potential for surveillance. "Consumers will begin to see
that biometrics delivers privacy and security at the same time," she says.
It's this pairing of seeming opposites that makes face
recognition so difficult to grapple with. By identifying individuals, it can
prevent people from being where they shouldn't be. Yet the profusion of
biometrics creates an inescapable security net that affords little privacy
and leaves room for serious mistakes with dire consequences. An error in the
face-recognition system could cause the ultimate in identity theft, with a
Miley Cyrus lookalike dining on Miley's dime or a hacker giving your digital
passport (and citizenship) to a stranger.
This summer, the FBI is focusing on face recognition with
the fourth step of its Next Generation Identification (NGI) programme, a $1.2bn
initiative launched in 2008 to build the world's largest biometric database. By
2013, the database held 73 million fingerprints, 5.7 million palm prints, 8.1
million mug shots and 8,500 iris scans. Interfaces to access the system are
being provided free of charge to local law enforcement authorities.
Jennifer Lynch, staff attorney for the privacy-focused
Electronic Frontier Foundation (EFF), notes that there were at least 14 million
photographs in the NGI face-recognition database as of 2012. What's more, the
database makes no distinction between criminal biometrics and those collected
for civil-service jobs. "All of a sudden, your image that you uploaded for
a civil purpose to get a job is searched every time there's a criminal
query," Lynch says. "You could find yourself having to defend your
innocence."
In the private sector, efforts are being made to ensure
that face recognition isn't abused, but standards are vague. A 2012 Federal
Trade Commission report recommends that companies should obtain
"affirmative express consent before collecting or using biometric data
from facial images". Facebook collects face-prints by default, though users
can opt out.
Technology entrepreneurs argue that passing strict laws
before face-recognition technology matures will hamper its growth. "I
don't think it's face recognition we want to pick on," Animetrics's
Schuepp says. He suggests that the technology itself is not the problem;
rather, it's how biometric data is controlled.
Yet precedents for biometric surveillance must be set
early in order to control its application. "I would like to see regulation
of this before it goes too far," Lynch says. "There should be laws to
prevent misuse of biometric data by the government and by private companies. We
should decide whether we want to be able to track people through society or
not."
What would a world look like with comprehensive biometric
surveillance? "If cameras connected to databases can do face recognition,
it will become impossible to be anonymous in society," Lynch says. In the
future, the government could know when you use your computer, which buildings
you enter on a daily basis, where you shop and where you drive. It's the
ultimate fulfilment of Big Brother paranoia.
But anonymity isn't going quietly. Over the past several
years, mass protests have disrupted governments in countries across the globe,
including Egypt, Syria and Ukraine. "It's important to go out in society
and be anonymous," Lynch says. But face recognition could make that
impossible. A protester in a crowd could be identified and fired from a job the
next day, never knowing why. A mistaken face-print algorithm could mark the
wrong people as criminals and force them to escape the spectre of their own
image.
If biometric surveillance is allowed to proliferate
unchecked, the only option left is to protect yourself from it. Artist Zach
Blas has made a series of bulbous masks, aptly named the "Facial
Weaponisation Suite", that prepare us for just such a world. The
neon-coloured masks disguise the wearer and make the rest of us more aware of
how our faces are being politicised.
"These technologies are being developed by police
and the military to criminalise large chunks of the population," Blas says
of biometrics. If cameras can tell a person's identity, background and
whereabouts, what's to stop the algorithms from making the same mistakes as
governmental authorities, giving racist or sexist biases a machine-driven excuse?
"Visibility," he says, "is a kind of trap."
A version of this article appeared in 'Newsweek'