Law in the Internet Society

Does the GDPR adequately protect individuals' privacy?

 
In the modern world of technology, where internet giants such as Google and Facebook collect vast amounts of personal data, regulating the collection of such data is essential. The interconnected relationship between data and individuals’ privacy over their own data needs to be examined to understand whether the current framework can achieve its own aims. This requires a two-step analysis: first, an examination of how data privacy is regulated and whether the standards imposed actually result in the promised protection; and second, an evaluation of whether privacy should be protected by other means than it currently is.
 
To aid in this analysis, the European General Data Protection Regulation (hereinafter “GDPR”) will be examined. It is one of the strictest data protection laws enacted anywhere in the world, and an examination of such a demanding privacy and data-protection standard should provide clarity as to whether adequate privacy protection has been achieved.
 
General Data Protection Regulation

 
Within the European Union, data protection is secured and regulated by the GDPR. The GDPR aims “to give citizens and residents of the European Union and the European Economic Area control over their personal data, and to simplify the regulatory environment for international business by fully harmonizing the national data laws of its member states”. However, the GDPR does not concern privacy alone; rather, its objectives relate to the “fundamental rights and freedoms of natural persons” surrounding the “processing and free movement of personal data”. Consequently, the GDPR also aims to address the rising power of Big Data practices and the “economic imbalance between [these companies] on one hand and consumers on the other”.

The GDPR addresses the power imbalance between data controllers, who derive significant commercial benefit from the use of data, and users, who bear the significant harms associated with the use of their own data. The legislation does this by placing explicit consent and anonymization techniques at the core of data processing. However, by focusing on these two specific aspects, the European legislators construct “structural regulatory spaces that fail to understand the ongoing challenges in delivering acceptable and effective regulation”. By concentrating exclusively on consent and anonymization techniques as the way to ensure data privacy and security, the GDPR fails to address not only the issues these concepts create but also how they should be implemented by app developers.
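What this implementation gap can look like in practice may be sketched concretely. The following is a hypothetical Python illustration (the identifiers and data are invented, and nothing here is drawn from the GDPR's text or any platform's actual code) of the kind of naive "anonymization" a developer might plausibly ship: direct identifiers are replaced with deterministic, unsalted hash tokens. Because the space of plausible identifiers is small and guessable, anyone can rebuild the mapping by hashing candidate values, which is why data treated this way is at best pseudonymized rather than truly anonymous.

import hashlib

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a deterministic, unsalted hash token."""
    return hashlib.sha256(identifier.encode("utf-8")).hexdigest()

record = {"email": "user@example.com", "pages_visited": 42}

# What gets released or shared: the identifier looks scrambled...
released = {
    "user_token": pseudonymize(record["email"]),
    "pages_visited": record["pages_visited"],
}

# ...but the same input always yields the same token, so anyone holding a
# list of candidate identifiers can rebuild the mapping by brute force.
for guess in ["alice@example.com", "user@example.com"]:
    if pseudonymize(guess) == released["user_token"]:
        print("re-identified:", guess)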

The GDPR's design creates two issues that significantly affect individual users’ privacy and data. Firstly, by using individuals’ consent as the gatekeeper to the lawful processing of data, the GDPR places heavy reliance on internet platforms themselves to fulfill the necessary GDPR standards. While simply obtaining users’ consent to the processing of their personal data does not by itself make the processing lawful, the fact that it is up to internet organizations themselves to implement adequate privacy standards says very little about the protection that such standards afford in reality. Secondly, the GDPR stipulates that when data is anonymized, explicit consent to the processing of the collected data is no longer required. By placing emphasis on anonymization techniques, the GDPR aims to reduce harmful forms of identification by preventing the singling out of natural persons and their personal information. However, as Narayanan and Shmatikov’s paper on the de-anonymization of large datasets and Oberhaus’s article on anonymous browsing data underline, de-anonymization of large datasets is standard industry practice for a majority of internet platforms.
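The linkage attacks that Narayanan and Shmatikov describe can be illustrated in miniature. The sketch below is a toy example of the general technique, not their actual method: the datasets are invented, and only standard pandas operations are used. An "anonymized" dataset, from which names have been stripped but quasi-identifiers (ZIP code, birth date, sex) retained, is joined against a public auxiliary dataset that still carries names; matching on the quasi-identifiers re-attaches identities to the supposedly anonymous rows.

import pandas as pd

# Released dataset: direct identifiers removed, quasi-identifiers retained.
anonymized = pd.DataFrame({
    "zip":        ["10027", "10027", "11201"],
    "birth_date": ["1990-04-12", "1985-07-30", "1990-04-12"],
    "sex":        ["F", "M", "F"],
    "diagnosis":  ["asthma", "diabetes", "hypertension"],
})

# Public auxiliary data (e.g. a voter roll), which does carry names.
voter_roll = pd.DataFrame({
    "name":       ["A. Example", "B. Sample"],
    "zip":        ["10027", "11201"],
    "birth_date": ["1990-04-12", "1990-04-12"],
    "sex":        ["F", "F"],
})

# Joining on the quasi-identifiers re-attaches names to "anonymous" rows.
reidentified = anonymized.merge(voter_roll, on=["zip", "birth_date", "sex"])
print(reidentified[["name", "diagnosis"]])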

Is the GDPR the right standard for privacy protection?

As outlined above, there are several issues with using the GDPR as the standard for privacy protection, the two biggest being its treatment of consent as the gateway to lawful processing and the ease with which data can be de-anonymized. Despite these issues, the GDPR has certain advantages as a standard for data protection, namely that it operates in what Professor Moglen, in his “The Union, May it Be Preserved” speech, described as a transactional sphere. While Professor Moglen sees this as a problematic quality of the GDPR, the fact that it treats the collection and use of personal data as a transaction, in which users trade consent for access to internet platforms, means that the regulation can easily be implemented by any type of corporation. The problem is that the standards of implementation are too lax, and when the GDPR came into force in 2018, the impact of de-anonymization technologies had not been sufficiently considered. One could argue that amendments tackling de-anonymization technologies would adequately address the current privacy issues. However, such an argument fails to address the fundamental power imbalance created by internet platforms such as Google, Yahoo, and Facebook, where individual users are given no choice as to how their data is processed.

Instead of working within the confines of the GDPR as it currently exists, Professor Moglen argues that we need to challenge the basic assumption that privacy and our data are part of the “transaction”. This idea has merit: why should our own personal data be the transactional token by which our privacy is achieved? In this sense, Professor Moglen’s conception of privacy as “ecological” and “relational among people”, rather than an issue of individual consent, seems to provide a stricter standard of privacy protection. Yet while an ecological conception of privacy could provide a much stricter standard of protection for individuals’ data, the means of achieving such protection are less concrete. Namely, if an ecological protection of privacy akin to environmental protection is adopted, what standard of privacy will serve as the baseline against which all protection is measured?

