  From: Amanda Chapman <alc2139@columbia.edu>
  To  : <CPC@emoglen.law.columbia.edu>
  Date: Sat, 25 Feb 2006 02:00:23 -0500

Paper 1: Face/Off?


Although not a new concept, especially after the attacks on the
World Trade Center, face recognition technology has nevertheless
recently reached new levels of intrusion into personal privacy. The
scandals of recent years over the biometric analysis of crowds at
the Tampa Super Bowl, over street camera surveillance linked to
criminal databases in Ybor City, and over airport face recognition
systems (Boston, Providence, San Francisco and Fresno, to name a
few) have had one thing in common: you, as a sports fan, pedestrian
or traveler, must have been physically present for the picture to
be captured and compared to mugshots, driver’s license photos or
other previously obtained surveillance images. One could almost
(though perhaps not convincingly) argue that you, as a citizen of
our digital age, by voluntarily venturing into places where you
know cameras will be operating, consent indirectly to the use of
such pictures, and take for granted that in return for this
intrusion into your personal privacy, your image will be used for
the prevention of suspicious or criminal activities. But what if a
system existed that could biometrically identify you to anyone with
internet access without your even leaving your apartment?

Such a system does in fact exist, and comes in the form of
state-of-the-art technology from a company called ‘Riya’. Riya’s
technology builds on current multimedia search techniques by adding
meta tags to images on the web (though currently this extends only
to photos uploaded specifically onto Riya’s servers). Until now, if
someone wanted to find a picture of me online, my name would have
to be included either in the file name or the picture’s tag for it
to be found. With Riya’s technology, once the software has access
to named images of me, it is then ‘trained’ to search out images
matching my biometric template from other uploaded pictures
(whether uploaded by me or someone else), even when my name does
not appear in the filename or the tag.
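
To make the mechanics concrete, the matching step can be pictured
roughly as follows. Riya has not published its algorithm, so this
is only a minimal sketch under common assumptions: some prior
face-detection step reduces each photo to a fixed-length numeric
template, and ‘training’ on my named photos amounts to comparing
every unnamed template against them. The function names and the
similarity threshold below are hypothetical.

    import numpy as np

    # Illustrative sketch only: Riya's real pipeline is proprietary.
    # Assumption: a prior face-detection step has already turned each
    # photo into a fixed-length "biometric template" (a numeric vector).

    def cosine_similarity(a, b):
        # 1.0 means the two templates point in exactly the same direction.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def find_matching_photos(named_templates, unnamed_photos, threshold=0.8):
        # named_templates: templates from photos already tagged with my name
        # unnamed_photos:  dict of photo_id -> template, with no name attached
        # threshold:       similarity above which two faces count as the same
        matches = []
        for photo_id, template in unnamed_photos.items():
            if any(cosine_similarity(template, t) >= threshold
                   for t in named_templates):
                matches.append(photo_id)
        return matches

The privacy problem described here follows directly from that
structure: once a single named template of you exists, every
unnamed photo anyone uploads becomes searchable against it.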

Although this may not sound particularly dangerous, if the
technology were extended to searches of the whole web, the system
could take a sinister turn. As Jennifer Granick, director of the
Stanford Law School Center for Internet and Society, puts it: ‘in a
digital world, you do not know if someone is taking your picture –
with a camera, a webcam or a cell phone – and the image can be
stored forever and searched by people you do not know, at any point
in time, without your knowledge and at little cost to the
searcher’.

Although the technology could be used for innocuous purposes, such
as finding out who your friend’s latest squeeze is, or laughing at
the embarrassing picture from college in which you are throwing up
in a corner, more serious consequences could follow, certainly in
the business world and indeed the criminal world. Imagine your boss
being able to see these pictures too, or, as Granick predicts, your
life insurance company finding out you spent your last vacation
bungee jumping off a 400-foot bridge! A stalker on a subway could
take a picture of a woman with his cell phone and, upon getting
home, upload the picture, run a quick search and find out her
identity, her marital status, her workplace and who her friends and
family are. A law school student taking part in a political protest
could find the resulting photo brandished by the opposition when
she runs for Congress. Stills from CCTV footage of a street corner
drug deal could be uploaded and the identity of the individuals
found within seconds. Politicians or public officials could be
embarrassingly ‘outed’ from footage of them in gay bars or with
call girls.

In sum, presuming that the great majority of the Western world is
sufficiently computer literate to upload a digital picture, a
potential online 'facesearch' database of millions of people could
be created, available to anyone, including police and national
security officials, for any reason whatsoever, legal or illegal.

Therefore, returning to the earlier proposition that your identity
could be revealed without your even leaving your apartment: so long
as someone had uploaded a picture of you with your name attached
(easy enough to imagine – think Facebook, university student
directories or school leavers’ photographs), anyone could then type
your name into Riya’s system and dredge up other photos of you
which are unnamed and were taken in private without consent, even
if you had been carefully walking around in public in a mask for
the past 10 years to avoid recognition by public surveillance
systems.

Riya’s circular defense to such concerns is that ‘if you don’t want
to be indexed, do not let anyone post photos of you’. However, as
mentioned above, this is nigh on impossible in today’s world of
discreet handheld image-capturing devices. It would take only one
posting, whether by an anonymous stranger or your best friend, and
even if it were later removed at your request, for the technology
to kick into effect and find unnamed pictures of you posted by
others.

Apart from the terrible impact such technology would have on one’s
personal privacy and integrity, another problem is its potential
(in)accuracy. An independent review of products available to the
government concluded that current technology cannot be relied upon
to pick an individual out of a crowd, making it largely ineffective
for apprehending potential criminals. Even 99.9% accuracy would
turn up 10,000 false positives for every person searched for in a
database of, say, 10 million people. Using Riya’s system, that is
10,000 photos supposedly of you which are in fact of someone else.
Anyone running a search on your name who does not know what you
look like (a potential employer, for instance) could easily
attribute the negative actions or situations in those photos to
you.
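
The arithmetic behind that figure is worth spelling out. The error
rate below is an assumption used for illustration, not a measured
property of Riya’s or any other real system:

    # Back-of-the-envelope false-positive arithmetic (assumed figures).
    database_size = 10_000_000   # faces indexed by a hypothetical system
    false_match_rate = 0.001     # "99.9% effectiveness" = 0.1% false matches

    false_positives = database_size * false_match_rate
    print(f"{false_positives:,.0f} wrong faces returned per search")  # 10,000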

Technology such as this must be highly regulated, if not completely
curbed; otherwise our last resort will be to follow in the
footsteps of Nicolas Cage and John Travolta by changing our faces,
making millionaires out of the recently successful French face
transplant surgeons. How would a state go about regulating a public
face search system? First of all, once this technology becomes
widely available for public use, people as a general rule will
become much more reluctant to have their picture taken by others.
It will become socially unacceptable to click away without the
consent of the person whose picture you are taking, and regulations
would be enacted to prevent club promoters and restaurant or bar
owners from collecting pictures to post on their promotional web
pages without consent. Legislation could be pushed through to make
the uploading of unnamed pictures taken without consent illegal, or
subject to stringent penalties. Whilst police and national security
agencies would still be allowed to use face recognition technology
to combat crime, their databases would not be available for general
public use, and, as with domestic wiretapping, a department would
need a warrant before it could conduct a biometric search on a
suspect. Similarly, once the technology becomes more accurate,
every international airport would ideally have two areas for
check-in and immigration, with passengers choosing when purchasing
their ticket between the luxury and speed of the ‘face scan’ area
and the high-security measures of the ‘standard’ area. We should
aspire to a state of affairs in which your face, just like your
signature, is uniquely yours and exploitable by you and you alone.

-----------------------------------------------------------------
Computers, Privacy, and the Constitution mailing list


