Law in the Internet Society

TWikiGuestFirstEssay 30 - 01 Nov 2021 - Main.RochishaTogare
A Growing Need to Protect Privacy in an Era of Growing Willingness to Give it Up

 
The Advent of Privacy Challenges

 
Those of us born in the 90s remember the in-between: the shift from people carrying cellphones to people carrying cellphones that could connect to the internet; from using a bulky computer in a stationary place to carrying around a laptop that let us take our work anywhere; from "social" meaning face-to-face meetings to social being a word that finds its place before "media."
 
We look at our current debates over privacy and think, "this is because of the internet revolution." But in fact, the right to privacy has been alluded to since the very founding of our nation. The U.S. Constitution, as interpreted by the Supreme Court, recognizes a right to privacy in multiple amendments. Further, the first article addressing privacy was written by Louis Brandeis (later a Supreme Court Justice) and Samuel Warren in their 1890 Harvard Law Review piece, which stemmed from the advent of photography and newspaper invasion into individuals' homes. In 1948, the U.N. Declaration of Human Rights addressed privacy, and soon after, in 1960, legal scholar William Prosser "outlined four torts that would allow someone whose privacy was violated…to sue the perpetrator for damages." (1)
 
The Modern Issues

 
In the past, such concerns were largely driven by individuals' lack of control over the actions of others: the press taking photos, the government invading their homes. In today's age, however, the concern is individuals' own ignorance of, or willingness to, forgo privacy for service. In an era of programmatic, targeted advertising, it's easy to give up our names, ages, emails, and phone numbers for the convenience and range of services that make life easier, often with the added allure of such services being free.

Earlier this month, former Facebook employee Frances Haugen released files revealing the results of the company's internal research on the impact of Instagram on teenage girls. A key statistic highlighted in the media is that "32 percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse" (2). One complication is that children under thirteen aren't even supposed to be making accounts, because collecting data on children under that age violates our country's privacy laws. Yet I know many of my classmates signed up for Facebook before they were thirteen using fake birthdays. Facebook has also floated the possibility of creating "Instagram Kids."

Similarly, humans invariably offer up their data. Sometimes we do so simply because we are unaware of what we are revealing (as with the military base whose location was exposed when soldiers decided to compete with each other, uploading their fitness tracker data and creating a map of their exercise routes in the process). At other times, we do so for convenience, as with the FreeStyle Libre sensors that use AI to recommend personalized diets based on individuals' glucose levels (4).

Attempts at Solving the Issue

Apple created a lot of buzz (and some very creative advertising campaigns) when it released a pop-up window that notifies users when an app is tracking their data, allowing users to prevent the app from doing so (3). Many small businesses and apps were upset by the change, arguing that such tracking was how they allowed users to access their services for free. Facebook responded by saying that it was attempting to create a method of advertising that doesn't rely on user data (3). But is it really that easy to dismantle a $350 billion digital industry? These companies have differing views of how far they should roll back such advertising.

While Big Tech attempts to revamp its own privacy systems, can and should users do more to take privacy into their own hands? I'm positive that many people would rather use an app for free than pay to remove advertising (as evidenced by the numerous app-store complaints when apps roll out pay-for-no-ads versions of their products). There is a growing industry of products that market themselves as shirking ads (for example Brave, the private web browser), but how many people choose to use such services?

Furthermore, what is the state of media literacy in our country? One of the first ways we can protect young children who will undeniably sign up for these enticing social media services is to inform them about what they give up in exchange for access to endless streams of videos, 150-word posts, and their friends’ photos.

In the long run, I would argue that this education is a must if we’re to convince people to pay for subscription fees in lieu of paying for such services with their data.

(1) https://safecomputing.umich.edu/privacy/history-of-privacy-timeline

(2) https://www.nytimes.com/2021/10/13/parenting/instagram-teen-girls-body-image.html

(3) https://www.nytimes.com/2021/09/16/technology/digital-privacy.html

(4) https://www.theguardian.com/lifeandstyle/2021/oct/05/intimate-data-can-a-person-who-tracks-their-steps-sleep-and-food-ever-truly-be-free


TWikiGuestFirstEssay 29 - 26 Oct 2021 - Main.KatharinaRogosch
A Growing Concern For Privacy
 
What

 
In the modern world of technology, where internet mammoths such as Google and Facebook collect large amounts of personal data, the regulation of such collection is essential. The interconnected relationship between data and individuals' privacy over their own data needs to be examined to understand whether the current framework can achieve its own aims. This requires a two-step analysis: first, an examination of the regulation of data privacy and whether the standards imposed actually result in said protection; and second, an evaluation as to whether privacy should be protected by other means than it currently is.
 
To aid in this analysis, the European General Data Protection Regulation (hereinafter "GDPR") will be examined, because it is one of the strictest data protection laws enacted worldwide; an examination of such a strict privacy and data-protection standard should provide clarity as to whether adequate privacy protections have been achieved.
General Data Protection Regulation:

 
Within the European Union, data protection is secured and regulated by the General Data Protection Regulation. The GDPR aims "to give citizens and residents of the European Union and the European Economic Area control over their personal data, and to simplify the regulatory environment for international business by fully harmonizing the national data laws of its member states". However, the GDPR does not only concern privacy; rather, its objectives relate to the "fundamental rights and freedoms of natural persons" surrounding the "processing and free movement of personal data". Consequently, the GDPR also aims to address the rising power of Big Data practices and the "economic imbalance between [these companies] on one hand and consumers on the other".
 
The GDPR addresses the power imbalance between data controllers, who derive significant commercial benefit from the use of data, and users who bear significant harms associated with the usage of their own data. The legislation does this by placing explicit consent and anonymization techniques at the core of data processing. However, by focusing on these two specific aspects, the European legislators construct “structural regulatory spaces that fail to understand the ongoing challenges in delivering acceptable and effective regulation”. By exclusively concentrating on consent and anonymization techniques as a way to ensure data privacy and security, the GDPR fails to address not only the issues these concepts create but also how these should be implemented by app developers.
There are two issues created by the GDPR that significantly affect individual users' privacy and data. First, by using individuals' consent as the gatekeeper to the lawful processing of data, the GDPR places heavy emphasis on internet platforms themselves to fulfill the necessary GDPR standards. While simply obtaining users' consent to the processing of their personal data does not make the processing of such data lawful, the fact that it is up to internet organizations themselves to implement adequate privacy standards says very little about the protection that such standards afford in reality. Second, the GDPR stipulates that when data is anonymized, explicit consent to the processing of the collected data is no longer required. At its core, by placing emphasis on anonymization techniques, the GDPR aims to reduce harmful forms of identification by preventing the singling out of natural persons and their personal information. However, as Narayanan and Shmatikov's paper on de-anonymization of large datasets and Oberhaus's article on anonymous browsing data underline, de-anonymization of large data sets is standard industry practice for a majority of internet platforms.

Is the GDPR the right standard for privacy protection?

As outlined above, there are several issues associated with using the GDPR as the standard for privacy protection, the two biggest being its treatment of consent as the standard for privacy and the ability to de-anonymize data. Despite these issues, there are a number of benefits associated with using the GDPR as the standard for data protection, namely that it functions in what Professor Moglen, in his "The Union, May it Be Preserved" speech, calls a transactional sphere. While Professor Moglen sees this as a problematic quality of the GDPR, the fact that the GDPR treats users' consent to the collection and usage of their data as a "transaction" for which they receive the benefit of accessing internet platforms means that the regulation can easily be implemented by any type of corporation. The issue with the GDPR is that its standards of implementation are too lax, and when the GDPR was drafted the impact of de-anonymization technologies was not sufficiently considered. One could argue that if amendments tackling de-anonymization technologies were implemented into the GDPR, the current privacy issues would be adequately addressed. However, such an argument fails to address the fundamental power imbalance created by internet platforms such as Google, Yahoo, and Facebook, where individual users are not given a choice as to how their data is processed.

Instead of working within the confines of the GDPR as it currently exists, Professor Moglen argues that we need to challenge our basic assumption that privacy and our data are part of the "transaction". To some extent this idea has merit: why should our own personal data be a transactional token by which our privacy is achieved? In this sense, Professor Moglen's definition of privacy as "ecological" and "relational among people", rather than an issue of individual consent, is one that seems to provide a stricter standard of privacy protection. While an ecological conception of privacy could provide a much stricter standard of data protection for individuals, the means of achieving such protection are less concrete. Namely, if an ecological protection of privacy akin to environmental protection is adopted, what standard of privacy is going to be the baseline against which all protection is measured?

 



TWikiGuestFirstEssay 28 - 25 Oct 2021 - Main.RochishaTogare

TWikiGuestFirstEssay 27 - 23 Oct 2021 - Main.KatharinaRogosch
The Matrix: A Non-Fiction
Surveillance is defined by the Merriam-Webster dictionary as a close watch kept over someone or something (as by a detective). Traditionally, it was an invasion of privacy with the goal of exposing illicit activities. Surveillance was always associated with feelings of fear, anxiety, stress, and distrust. In the past century, a new form of surveillance has emerged. As Shoshana Zuboff described it, surveillance capitalism depends on exploiting and controlling human nature. Companies like Google and Facebook extract information from us and employ it to re-design our behavior with the goal of profit maximization. Quite simply, these companies are using technology not only to invade our privacy, but also to gradually, slightly, imperceptibly change our own behavior and perception. The technologies being used target our conscious and subconscious behavior and emotions. They tap into our desires and manipulate us into reacting the way they want us to. We have willfully surrendered our free will and agreed to be manipulated and engaged rather than actively making free, undirected choices. Their goal is to automate us. Furthermore, surveillance capitalism is even being used by governments to control societies as a whole by affecting elections, suppressing opposition, and directing the population to adopt the government's way of thinking. An example would be Russia's use of Facebook and Twitter in the 2016 election, creating accounts and spreading polarizing misinformation in order to manipulate Americans into casting their votes for Donald Trump.
 
The Matrix is a science fiction movie about humans trapped in a simulation controlled by machines, while other humans play with the switches. In the world of surveillance capitalism, is the Matrix still science fiction?
Why do we "trust this device"?
 
It is no longer a secret that these tools are being used to surveil us and modify our behavior in order to maximize company profits. Yet somehow, even as Mark Zuckerberg testified before the Senate after the Cambridge Analytica scandal, Facebook was still making billions. We do not fear this surveillance because the tools it uses are attractive objects, give us a false sense of control, and have embedded themselves into our existence. Fear is an emotional response largely motivated by what we perceive as threatening our existence. The tools that are surveilling us are purposefully designed in a way that attracts us to them. They exploit the innate human attraction to beauty. The nature of tech today has made user experience and user interface design more important than ever before. The products are far more elegant than they used to be. They focus on colors, shapes, clicks, feel, and ease of use to make the product more appealing to the senses. We also perceive these elegant tools as harmless, immovable objects, incapable of threatening our existence.
Furthermore, we are told that these tools are there to serve us, giving us a sense of control. Meanwhile, we have become prey to tools designed to intentionally get us addicted, stripping us of actual control. Stanford University has a persuasive design lab whose purpose is to teach the art of persuasion to its engineers and product designers, including strategies such as placing 'hot triggers' in the path of motivated users. Such hot triggers include colorful icons that glow with a light pulse when notifications remain unread, a smartwatch poking you to make sure you don't miss an update, or a "next episode" box on Netflix. Even though we know it's time to go to bed, we don't turn off the TV, and the next episodes play automatically. The timer shown before the next episode automatically plays is there to give us a sense of control. We, the users of email, social media, health apps, and smartphones, are in a continuous state of distraction. Without knowing why, we find ourselves on social media, unintentionally jumping from one platform to another.
Finally, these tools are embedded into our daily lives, and we have relied on them so enormously that we are unable to envision an alternative. Google, Facebook, Apple, et al. want to render the choices they want us to make easier, and the choices they don't want us to make harder, or nonexistent. These tools are the new norm, and we do not fear what we know, or think we know. We genuinely believe that we cannot function, keep track of our events, find a date, find a job, have a social life, listen to music, or stay healthy without these tools. We have thus surrendered to a fascist way of thinking in which we don't question things if they are working.
 
Getting Out of the Matrix
We need to start by being aware of the reality of things. This attractive "object" has taken on the form of a physiological nervous system capable of creeping into our conscious and subconscious minds and manipulating our behavior. This "being" is a frightening threat to humans and our freedom of thought, one that should activate our defense mechanisms and response. We must educate ourselves and those around us that there are other alternatives. We can use technology that allows us to live freely. Most importantly, we must educate the generations that are growing up believing that this physiological nervous system is their security blanket. We must teach them how to code so they can fight it from the inside.
 
Once we are alert and aware, we must take actual control by pushing back instead of being pushed around. We must refuse to be the submissive, passive, engaged victims of these tools. We can start by not swiping up for advertisements, turning off all notifications, not watching another episode, and gradually decreasing our interaction with these tools. Our time, attention, and freedom of choice are invaluable, and we must protect them. Get out of the Matrix.


TWikiGuestFirstEssay 26 - 22 Oct 2021 - Main.NathalieNoura