Computers, Privacy & the Constitution


An Introduction to How Personal Privacy and Personal Autonomy Have Changed with Technology

-- By CorinneShim - 22 Mar 2017

Introduction

We currently live in a society where individual privacy and individual autonomy are at great risk because of the ways we interact with digital devices and data. In this paper, I will introduce how data mining intrudes on personal privacy, how algorithms intrude on personal autonomy, and possible ways to address these problems.

Personal Privacy

Personal information given to third parties has historically not been protected under the Fourth Amendment, but the scope of that doctrine has vastly expanded with developments in technology, and personal privacy is now in danger of becoming obsolete. The Supreme Court has “repeatedly held that the Fourth Amendment does not prohibit the obtaining of information revealed to a third party and conveyed by him to Government authorities, even if the information is revealed on the assumption that it will be used only for a limited purpose and the confidence placed in the third party will not be betrayed” (United States v. Miller, 1976). When a third party holds only limited information about an individual, this is not so problematic. In a world of comprehensive computing and communication data collection, it is extremely problematic: behaviors ranging from how long you spend on a website to the content of all your purchases can be collected by companies to build profiles of individual behaviors and habits. For a company focused on selling products, creating or selling profiles of consumer behavior is simply a more efficient way of targeting ads. There is a record of everything you have ever bought through Amazon, a record of everything you have written and deleted on Facebook, and a record of every website you have visited, and none of these records are protected. In June 2016, a Virginia district court held that there is no reasonable expectation of privacy in an IP address when using the Internet, so that once the government found a defendant's IP address on a child pornography site, it could use it to hack his computer (United States v. Matish, E.D. Va. June 2, 2016, part of the Playpen cases, in which the FBI, after obtaining a search warrant, hacked into a child pornography website and collected over a thousand IP addresses of its users).
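To make concrete how little machinery such profiling requires, here is a minimal sketch in Python of how scattered records of site visits and purchases can be folded into a per-user behavioral profile. The field names, user identifiers, and data are invented for illustration; this is not any company's actual system, only the kind of aggregation described above.

<verbatim>
# Illustrative sketch only: hypothetical event records folded into a
# per-user behavioral profile. All field names and data are invented.
from collections import defaultdict

events = [
    {"user": "u123", "type": "page_view", "site": "news.example.com", "seconds": 340},
    {"user": "u123", "type": "purchase", "item": "running shoes", "price": 89.99},
    {"user": "u123", "type": "page_view", "site": "shop.example.com", "seconds": 1210},
]

def build_profiles(events):
    """Aggregate raw events into a simple profile per user."""
    profiles = defaultdict(lambda: {"seconds_by_site": defaultdict(int), "purchases": []})
    for e in events:
        p = profiles[e["user"]]
        if e["type"] == "page_view":
            p["seconds_by_site"][e["site"]] += e["seconds"]
        elif e["type"] == "purchase":
            p["purchases"].append((e["item"], e["price"]))
    return profiles

print(build_profiles(events)["u123"])
</verbatim>

A few dozen lines like these, run over every record a company retains, are enough to turn raw logs into the kind of behavioral profile discussed above.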

There is, however, some hope: on February 1, 2017, Edina Police Detective David Lindman was granted “a search warrant to obtain the names, email addresses, account information and IP addresses of everyone in the entire town of 50,000 who had searched for any variation of [Douglas Junker] between Dec. 1, 2016, and Jan. 7, 2017,” including “name(s), address(es), telephone number(s), dates of birth, Social Security numbers, email addresses, payment information, account information, IP addresses, and MAC addresses of the person(s) who requested/completed the search.” This request clearly exceeds anything United States v. Miller or the Fourth Amendment could have contemplated, and it may be the shock needed to open a forum for re-evaluating our right to personal privacy as a society.

Personal Autonomy

Technology has influenced personal autonomy in many ways, one of which is by creating and reinforcing human prejudices through software algorithms. Software algorithms are a product of their programmers' beliefs, which means they can reflect unrecognized biases and produce unintended consequences. For example, a Carnegie Mellon study found that Google's advertising system showed an ad for high-income jobs to men six times as often as it did to women (“Automated Experiments on Ad Privacy Settings”). The algorithm was probably not meant to disproportionately favor showing men high-income jobs, but that is nevertheless what it does.
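To illustrate how such a skew can arise without any explicit rule about gender, here is a toy Python sketch in which an ad-delivery rule simply shows the ad in proportion to past click-through rates. The click counts are invented; the point is that a rule optimized against skewed historical data reproduces a six-to-one disparity even though no group preference is ever written into the code.

<verbatim>
# Toy illustration only: the click counts are invented. An ad-delivery rule
# that optimizes against skewed historical data reproduces the skew, even
# though nothing in the code expresses a preference between the two groups.
historical_clicks = {
    "group_a": {"shown": 1000, "clicked": 60},
    "group_b": {"shown": 1000, "clicked": 10},
}

def show_probability(group):
    """'Learn' to show the ad in proportion to each group's past click rate."""
    stats = historical_clicks[group]
    return stats["clicked"] / stats["shown"]

for group in historical_clicks:
    print(group, show_probability(group))
# group_a is now shown the ad six times as often as group_b.
</verbatim>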

In some cases, the consequences are intended: companies like Amazon use algorithms to increase consumer spending (Quartz, “Amazon's algorithms are misleading customers and causing them to spend way more than they should”). Facebook once ran a week-long experiment that influenced the emotions of over 600,000 users (Forbes, “Facebook Manipulated 689,003 Users' Emotions For Science”). As a profit-driven decision, the ethics can seem trivial; as a social experiment, the ethics are controversial. But imagine, for a moment, that instead of focusing on consumer purchasing behavior or consumer emotional behavior, we were to combine the two and influence, say, a political contribution by making somebody happier with one candidate than another. This is where our technology is currently heading. That humans are influenced by their environment is not a problem. That we are being influenced, without our awareness or consent, by institutions that are not looking out for our best interests is a huge problem, possibly even bigger than our loss of personal privacy as a society.

Possible Frameworks for Addressing these Concerns

The first framework that comes to mind is to identify individual rights that must not be violated, and then to create laws protecting those rights with respect to technology. Lawmakers have done this in some areas, such as with the Student Privacy Protection Act. The search warrant granted to Detective Lindman shows we have not created many such protections; perhaps we could create statutory limitations on police power. Facebook has demonstrated that individual consent can allow it to run large-scale experiments that in the analog world would have required hundreds of thousands of individual consent forms. Perhaps we could require companies to obtain a more tangible form of consent from users.

A second possibility is to identify certain values within society, and to create legal precedents that allow those values to be protected. The relatively recent arrival of computing and communication technology means there is still room for discussion about questions such as when government surveillance is appropriate. The Playpen cases show that even on the same facts, courts are deciding differently. For example, United States v. Michaud was dismissed because the government did not wish to disclose its method of finding users' real IP addresses and moved to dismiss the case. However, United States v. Tippens, another Playpen case before the same judge, is moving forward.

