Computers, Privacy & the Constitution

EileenHutchinsonSecondPaper 2 - 12 May 2013 - Main.EbenMoglen

Say Cheese: Biometrics and Facial Recognition Technology

This is an excellent account of the state of the art. It also correctly helps the reader to extrapolate the future in some respects. But you are somewhat limiting the forms of social control that are enabled by the Net's ability to recognize individual faces. You limit yourself to inherently exceptional examples: no matter who we are, we are mostly not a face at a political rally. Instead we are being recognized everywhere we go, by a network of machines whose overall purpose is to affect our behavior, either to sell us something, or to inhibit our disobedience or obstructive differentness. A little more thinking on the consequences of those developments would be in order.

 
EileenHutchinsonSecondPaper 1 - 05 May 2013 - Main.EileenHutchinson

Say Cheese: Biometrics and Facial Recognition Technology

-- By EileenHutchinson - 05 May 2013

Government Use of Biometric Technology

For the past few years, federal, state and local governments have pushed for the development of multimodal biometric systems (systems that combine two or more biometrics, such as fingerprints, palm prints, iris scans, facial images and scars), which they argue will make identification more precise. The FBI, which has invested $1 billion in the multimodal biometrics project Next Generation Identification (NGI), has stated that it needs “to collect as much biometric data as possible…and to make this information accessible to all levels of law enforcement, including International agencies.” To that end, the FBI has already begun collecting what it calls “face-recognition ready photographs” in several pilot states. NGI represents a break from past practice in that non-criminal photographs will be added to the database. The FBI is also working toward a system in which all criminal and civil data will be linked, so that a single search can reach every record. Furthermore, the resurgent immigration debate has reignited discussion of biometric ID cards for citizens and non-citizens to verify their eligibility to work in the United States.

I had already begun researching the FBI’s NGI program when, a few weeks ago, I came across an article in The Atlantic, “Why Hasn’t the FBI’s Facial Recognition Technology Found the Boston Bombers?” The article bemoaned the fact that NGI was thus far confined to pilot states and explained what other methods the government would use to identify the bombing suspects. The FBI would likely run video stills against official passport, visa and driver’s license databases in addition to its own criminal databases; it might also take a page from the NYPD’s Facial Recognition Unit and run suspect photos against the photo databases of Facebook and Instagram. What fascinated me about the article was that, though NGI won’t be fully rolled out until 2014, the government and the private sector have already quietly joined forces on facial recognition technology and, what’s more, the public seems fine with it. The question of the hour seems to be: why doesn’t this technology work better?

Private Sector Use of Facial Recognition Technology

The use of facial recognition technology by private sector companies has grown dramatically in recent years, with Google, Apple and Facebook (the current king of facial recognition technology) joining in. In 2010, Facebook introduced the “Photo Tag Suggest” feature, which uses facial recognition technology and previously tagged photos to identify individuals in newly posted photos. As of October 2012, Facebook had roughly 220 billion photos in its collection, and roughly 300 million new photos were being uploaded each day. A photo tag is actually a hyperlink to the user’s Facebook profile, so if an individual can be identified via Photo Tag Suggest, all of his or her public information is available to the viewer. Photo Tag Suggest was marketed as a tool to make sorting and tagging photos easier, and Facebook, unsurprisingly, did not make clear that the feature created a “faceprint” (a unique biometric based on the geometry of the face) for every user. Although Facebook claims that users can opt out of the system, most users are unaware that their biometric information has been extracted and stored, and the process of deleting one’s faceprint is sufficiently convoluted that most users will never successfully erase their information.

In 2011, Professor Alessandro Acquisti of Carnegie Mellon University conducted a series of experiments focused on the convergence of two trends: the improvement of facial recognition technology and the increasing public availability of identified facial photos online. In Experiment 1, his team took unidentified profile photos from an online dating site where people use pseudonyms to protect privacy and compared them to Facebook photos that could be accessed through a search engine (i.e., without logging into Facebook). Using facial recognition technology, Acquisti’s team identified 10% of the individuals. In Experiment 2, his team took webcam photos of students on a college campus and compared those pictures to Facebook photos; the team identified one-third of the subjects. Finally, in Experiment 3, the team used facial recognition together with algorithms that predict Social Security numbers (SSNs) from public data to infer the interests and the first five digits of the SSNs of some of the individuals who had participated in Experiment 2. In testimony before Congress, Acquisti stated that the barriers to more accurate recognition are transient, disappearing as facial recognition technology advances and social networks amass larger databases of identified photos. His experiments show that the convergence of face recognition, online social networks and data mining already allows sensitive inferences to be drawn from an anonymous face.
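
To make concrete how this kind of re-identification works at a technical level, here is a minimal sketch in Python of the core step: each photo is reduced to a numeric “faceprint” vector, and an anonymous probe photo is matched against a gallery of identified photos by a similarity score. This is an illustration of the general technique only, not the actual tools used by Acquisti’s team or by any social network; the embed_face helper, the file names and the matching threshold are assumptions made for the example.

import numpy as np

def cosine_similarity(a, b):
    # Similarity between two faceprint vectors; values near 1.0 indicate a close match.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe_vector, gallery):
    # gallery: list of (name, faceprint_vector) pairs built from identified
    # photos, e.g. publicly searchable social-network profile pictures.
    scored = [(name, cosine_similarity(probe_vector, vec)) for name, vec in gallery]
    return max(scored, key=lambda pair: pair[1])

# Illustrative use (embed_face is an assumed helper that turns an image file
# into a fixed-length vector; it is not a real library call):
# probe = embed_face("anonymous_dating_profile.jpg")
# gallery = [(name, embed_face(path)) for name, path in identified_photos]
# name, score = best_match(probe, gallery)
# if score > 0.8:  # threshold chosen only for illustration
#     print("Probable identity:", name)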

The Trade-offs of Facial Recognition

The advancement of facial recognition technology is an unavoidable reality. Certainly there are ways that facial recognition can improve our quality of life. Dangerous criminals may be identified and arrested more quickly. (Had facial recognition worked in the case of the Boston bombers, the suspects likely would have been apprehended sooner, avoiding the murder of a police officer and the shutdown of the city of Boston.) Missing children could be identified and recovered. Personal security systems could be strengthened through the use of biometric identifiers.

The perils of facial recognition, however, are severe, and the time to address those dangers is already passing. By its nature, biometric information (unlike addresses, SSNs and names) is unique, unchangeable without resort to dangerous and expensive medical procedures, and non-duplicative. Facial recognition creates privacy concerns that fingerprints do not. While anyone could dust a scene for fingerprints, the use of a faceprint can yield a subject’s name and social media accounts; the subject can be traced in the street and online, and his friends can be tracked, all without the subject’s knowledge. Although the FBI has stated that facial recognition use will be limited to “criminal justice purposes,” Senator Al Franken pointed out in a hearing that the FBI used photographs taken at political rallies (for President Barack Obama and former Secretary of State Hillary Clinton) in a presentation of the technology. The FBI’s history of domestic surveillance certainly doesn’t bode well for a narrow interpretation of “criminal justice purposes.” Since 2010, the National Institute of Justice has spent millions developing facial recognition binoculars to identify people at a distance and in crowds. It’s only too easy to imagine this kind of technology being used on innocent civilians and, especially, on individuals attending political protests. Authoritarian regimes have already demonstrated a willingness to use Facebook for oppressive purposes. As facial recognition technology improves in those countries, the danger of being one face among thousands at a political rally will increase exponentially.

