Law in the Internet Society

CarlaDULACFirstEssay 6 - 22 Jan 2020 - Main.CarlaDULAC
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Line: 15 to 15
 

If we were to ask the relatively wealthy people of our generation whether they own a smartphone, use the internet daily, or have social media accounts, their answer would be yes, without any hesitation.

Changed:
<
<
But what they and I haven't realised when using those technologies, is that we share our personal information either intentionally or unintentionally with private or public entities worldwide, that tend to violate our right to privacy by selling information without our consent to companies we haven't even heard about. Personal data are any anonymous data that can be double checked to identify a specific individual.
>
>
But what they and I have not realized when using these technologies is that we share our personal information, intentionally or unintentionally, with private and public entities worldwide, which tend to violate our right to privacy by selling that information without our consent to companies we have never even heard of. Personal data are any data, even apparently anonymous data, that can be cross-referenced to identify a specific individual.
 

Those data are precious to companies because they reveal our preferences and our likes, which can be sold to other companies to increase their sales.

Line: 51 to 51
The GDPR can be used to erase online content, whether or not that content actually violates anyone else's rights. Such a use could constitute a violation of freedom of information, which is part of the fundamental right of freedom of expression. Those rights are recognized in international law, as in Article 19 of the Universal Declaration of Human Rights. It is true that there are very important concerns about data protection and privacy in the face of mass collection of our data by companies, but the way the "right to be forgotten" is built is not appropriate. In fact, freedom of expression in the public interest is essential, even for information obtained unlawfully.
Changed:
<
<
In my opinion, the right to be forgotten as designed by the GDPR is in total opposition with the freedom of expression. The right to ask search engines to de-index web pages, as well as the right of erasure, is encompassed in this legislation. What I think is problematic is that under article 17, it’s data controllers (usually search engines) which are the initial adjudicators of requests. This is problematic because search engines do not own the content that the individual is asking to have removed. Editorial decisions must rest with publishers — not tech companies. Otherwise, it can evolve into a form of censorship, since as individuals we don’t have control over what is removed. Furthermore, it is on our behalf, that states have a right to censor indexes, using deindexing orders. Once again it questions about censorship: states impose us a duty to forget those indexes. We can’t say anything. Why can’t we deal with our information the way we want? Why do we need a third part to be involved?
>
>
In my opinion, the right to be forgotten as designed by the GDPR stands in total opposition to freedom of expression. The right to ask search engines to de-index web pages, as well as the right of erasure, is encompassed in this legislation. What I find problematic is that under Article 17 it is data controllers (usually search engines) that are the initial adjudicators of requests. This is problematic because search engines do not own the content that the individual is asking to have removed. Editorial decisions must rest with publishers, not tech companies. A site does not have to be indexed (the process of downloading a site's or a page's content to the search engine's servers, thereby adding it to its index) in order to be listed (shown in the search result pages). If the indexer decides to de-index a page, that does not remove the source content from the internet: it only means the underlying website will not be listed in the search results. As a result, de-indexing can be compared to a right to obscurity or a right to oblivion. Furthermore, it is on our behalf that states claim a right to censor indexes, using de-indexing orders. Once again this raises questions about censorship: states impose on us a duty to forget those indexes. We can't say anything. Why can't we deal with our information the way we want? Why does a third party need to be involved?
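To make the distinction between indexing, listing, and the underlying content concrete, here is a minimal, purely hypothetical sketch in Python. It does not describe Google's or any real search engine's implementation; all names are invented for illustration. It only shows that de-indexing removes a page from what the engine lists, while the source content remains published at its own address.

# Toy model of a search engine's index (illustrative only; all names hypothetical).
class ToySearchIndex:
    def __init__(self):
        self.index = {}  # url -> copy of the page's text held on the engine's side

    def crawl(self, url, page_text):
        # Indexing: the engine downloads the page's content into its own store.
        self.index[url] = page_text

    def deindex(self, url):
        # De-indexing: the page is dropped from the engine's store only;
        # the source website itself is untouched.
        self.index.pop(url, None)

    def search(self, term):
        # Listing: return the URLs whose indexed text mentions the term.
        return [url for url, text in self.index.items() if term in text]

# The "web" itself, which the engine never controls.
web = {"https://example.org/old-article": "report naming Jane Doe"}

engine = ToySearchIndex()
engine.crawl("https://example.org/old-article", web["https://example.org/old-article"])
print(engine.search("Jane Doe"))                   # ['https://example.org/old-article']
engine.deindex("https://example.org/old-article")
print(engine.search("Jane Doe"))                   # [] -- no longer listed
print("https://example.org/old-article" in web)    # True -- but still published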
 
Changed:
<
<
What should be done in order to change that? EU law isn’t providing ways to effectively protect privacy and individuals often part with their information without knowing that they have surrendered some privacy.
>
>
What should be done to change that? EU law does not provide ways to protect privacy effectively, and individuals often part with their information without knowing that they have surrendered some privacy.
The answer to the problem may be to abolish the right to be forgotten, since it does more harm than good. From a psychological or neurological point of view, no one can be forced to forget. Privacy is a very important right that must be protected, but there are limits. If you did something, you did it. If something was published, it cannot be unpublished. Another question is where the right to be forgotten fits into a world that increasingly runs on blockchains, which are designed precisely to record everything permanently.
Line: 62 to 63
 
Deleted:
<
<

I'm not sure I understand this analysis. Does anyone have a right to be indexed? If the indexer makes the decision to de-index a page, how is that actionable? In which case, it is difficult to see how the free expression interest at stake is the interest of any individual speaker; the right of expression trenched upon is in all cases the indexer's, is it not?

 



CarlaDULACFirstEssay 5 - 11 Jan 2020 - Main.EbenMoglen
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Line: 64 to 64
 
Deleted:
<
<
 
Added:
>
>
I'm not sure I understand this analysis. Does anyone have a right to be indexed? If the indexer makes the decision to de-index a page, how is that actionable? In which case, it is difficult to see how the free expression interest at stake is the interest of any individual speaker; the right of expression trenched upon is in all cases the indexer's, is it not?
 


CarlaDULACFirstEssay 4 - 02 Dec 2019 - Main.CarlaDULAC
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Line: 13 to 13
 

My worries about having no say in the use of my personal data.

Deleted:
<
<
If we were to ask people from our generation

In the minority of the human race that is wealthy....

if they own a smartphone, use the internet on a daily basis or have social media accounts, their answer would be yes without any hesitation. But what they and I haven't realised when using those technologies, is that we share our personal information either intentionally or unintentionally with private or public entities worldwide, that tend to violate our right to privacy by selling information without our consent to companies we haven't even heard about.

 
Changed:
<
<
Any information related to us, in our private or professional life, that we share online, constitutes our personal data.

Not necessarily. The information might not have anything to do with you at all. Once again, you might want to distinguish between the information distributed and the fact that you are either disseminating or receiving it. And you might be more attentive to the receiving activity rather than the transmitting activity, precisely because if you are, it will destabilize the structure of the current draft's argument.

>
>
If we were to ask quite wealthy people from our generation if they own a smartphone, use the internet on a daily basis or have social media accounts, their answer would be yes without any hesitation. But what they and I haven't realised when using those technologies, is that we share our personal information either intentionally or unintentionally with private or public entities worldwide, that tend to violate our right to privacy by selling information without our consent to companies we haven't even heard about. Personal data are any anonymous data that can be double checked to identify a specific individual.
 

Those data are precious for companies because they reveal our preferences, our likes that can be sold to other companies to increase their sell.

Line: 54 to 34
It seemed a good way to act after the fact and to correct our past mistakes. But can we really rely on it? It did not take long before we could see the limits of this regulation.

In fact, the ECJ issued a ruling on September 25, 2019, regarding the implementation of the EU's right to be forgotten.

Changed:
<
<
This dispute involved Google and France's data authority (CNIL). The ruling held that CNIL can compel Google to remove links to offending material for users that are in Europe, but can't do that worldwide. We should not be shocked with the ECJ ruling, it was foreseeable since a different ruling could have been viewed as an attempt by Europe to police an US tech giant beyond US borders.
>
>
This dispute involved Google and France's data protection authority (CNIL). The ruling held that CNIL can compel Google to remove links to offending material for users in Europe, but cannot do so worldwide. We should not be shocked by the ECJ ruling, which denies all extraterritorial effect to EU measures that seek to implement the duty to forget coercively. It was foreseeable, since a different ruling could have been viewed as an attempt by Europe to police a US tech giant inside US borders.
 
Deleted:
<
<
No. It upholds the right to police the US company beyond US borders. Perhaps you meant inside US borders. That is, to displace US law in the US. But the point is to deny all extraterritorial effects to EU orders coercively implementing the duty to forget or censoring indexes through deindexing orders. As I emphasized in class, SFLC.in, the Indian sister organization to my own Software Freedom Law Center, intervened in the ECJ against CNIL, from a purely non-US perspective.
When we read through the decision, we see another underlying problem: how can the right to privacy and freedom of information work together? France asked the Court to extend the right to be forgotten universally, to people outside the EU. Google argued that such a ruling might result in global censorship and infringement of freedom of information rights. The real issue at stake is: can we impose on others a duty to forget?
Line: 67 to 45
 
Changed:
<
<

What about freedom of information. What else can we do?

>
>

What about freedom of information in the context of censorship?

 
Deleted:
<
<
As said before, one good aspect of GDPR is that it creates a fast process to erase the data that Internet companies collect and store for use in profiling and targeted advertising. But on other side it can be used to erase online content, whether or not that content actually violates anyone else's rights. Such a use could constitute a violation of freedom of information with is part of the fundamental right of freedom of expression. Those rights are recognised in international law, as in the article 19 of the Universal Declaration of Human rights. It is true that there are some very important concerns about data protection and privacy in face of mass collection of our data by companies, but the way "right to be forgotten" is built is not appropriate.
 
Changed:
<
<
First of all, freedom of expression for the public interest is essential, even for information obtained unlawfully. In fact, information may not have been accessible otherwise because it was kept secret, but for the good of the society, they have been revealed. Allowing a right to forget for those data would be harmful for the society. One other argument would be that allowing people to have some links related to their name delated could become a way to hide the truth and give a false picture of who they are. Imposing on people that are seeking information on other a duty to forget on some other's information is not the good answer. Individuals have a right to access all the information available, and past mistakes should not be forgotten, but used as examples. Maybe one answer to the problem would be to allow a right of correction, to reply that restricts less the freedom of expression, compares to the right to be forgotten. It would enable individuals to present themselves as they really are and correct false information about them.
>
>
GDPR can be used to erase online content, whether or not that content actually violates anyone else's rights. Such a use could constitute a violation of freedom of information with is part of the fundamental right of freedom of expression.Those rights are recognised in international law, as in the article 19 of the Universal Declaration of Human rights. It is true that there are some very important concerns about data protection and privacy in face of mass collection of our data by companies, but the way "right to be forgotten" is built is not appropriate. In fact freedom of expression for the public interest is essential, even for information obtained unlawfully.

In my opinion, the right to be forgotten as designed by the GDPR is in total opposition with the freedom of expression. The right to ask search engines to de-index web pages, as well as the right of erasure, is encompassed in this legislation. What I think is problematic is that under article 17, it’s data controllers (usually search engines) which are the initial adjudicators of requests. This is problematic because search engines do not own the content that the individual is asking to have removed. Editorial decisions must rest with publishers — not tech companies. Otherwise, it can evolve into a form of censorship, since as individuals we don’t have control over what is removed. Furthermore, it is on our behalf, that states have a right to censor indexes, using deindexing orders. Once again it questions about censorship: states impose us a duty to forget those indexes. We can’t say anything. Why can’t we deal with our information the way we want? Why do we need a third part to be involved?

What should be done in order to change that? EU law isn’t providing ways to effectively protect privacy and individuals often part with their information without knowing that they have surrendered some privacy. The answer to the problem is maybe to forbid the right to be forgotten, since it does more harm than good. From a physiological or neurological point of view, no one can be forced to forget. Privacy is a very important right that must be protected, but there are limits. If you did something, you did it. If something was published, it cannot be unpublished. Another argument is to know where does the right to be forgotten fit into a world that functions through blockchain which is designed precisely to record everything permanently?

 
Deleted:
<
<
Finally, instead of using a right to be forgotten that does more harm to freedom of expression, we could use other remedies such as going to court which will decide if the information will remain available to the public society or not.
 
Deleted:
<
<
This is still judicial censorship, is it not?
 
Deleted:
<
<
We could also use mechanisms available on social media platforms that helps to identify harmful content and then can be removed by those platforms.
 

Deleted:
<
<
The draft does a good job explaining existing European legal phenomena. It also clearly voices the basic free expression arguments that limit the use of censorship orders in the interest of privacy. But the draft lacks a clear idea of your own to be placed in dialogue with these exterior sets of ideas, which is why the conclusion finds you basically throwing up your hands. Making the draft stronger means bringing your ideas to the front. You can say what you think, show how your idea emerges from your understanding of the two other points of view, then provide some real conclusion on which the reader can then base her own further thinking.

As I said in the early lines of the draft, by abstracting from the narrow "right to be forgotten" fact pattern, you can make progress. Censorship orders seem useful, whatever their problems in principle, only so far as we concern ourselves with searching: in the end, it's about the state's right to censor indexes on behalf of individuals. But if you start from the role of the same parties in surveilling what everyone reads, it is rapidly apparent that censoring indexes, or indeed censoring content directly, will not address the center of the problem at all.

 
Deleted:
<
<
Once it is apparent that the orders censoring indexes are a minor response to an infinitesimal part of the problem, one can no longer cast the issue as "right to be forgotten" against "free expression." That makes much more room for your own ideas.
 


CarlaDULACFirstEssay 3 - 24 Nov 2019 - Main.EbenMoglen
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"
Deleted:
<
<
It is strongly recommended that you include your outline in the body of your essay by using the outline as section titles. The headings below are there to remind you how section and subsection titles are formatted.
 
Changed:
<
<

Privacy, Protection of our data: the right to be forgotten online versus freedom of information

>
>

Privacy, Protection of our data: the right to be forgotten online versus freedom of information

 

Line: 16 to 14
 

Changed:
<
<
If we were to ask people from our generation if they own a smartphone, use the internet on a daily basis or have social media accounts, their answer would be yes without any hesitation.
>
>
If we were to ask people from our generation

In the minority of the human race that is wealthy....

if they own a smartphone, use the internet on a daily basis or have social media accounts, their answer would be yes without any hesitation.

 But what they and I haven't realised when using those technologies, is that we share our personal information either intentionally or unintentionally with private or public entities worldwide, that tend to violate our right to privacy by selling information without our consent to companies we haven't even heard about.
Changed:
<
<
Any information related to us, in our private or professional life, that we share online, constitutes our personal data. Those data are precious for companies because they reveal our preferences, our likes that can be sold to other companies to increase their sell.
>
>
Any information related to us, in our private or professional life, that we share online, constitutes our personal data.

Not necessarily. The information might not have anything to do with you at all. Once again, you might want to distinguish between the information distributed and the fact that you are either disseminating or receiving it. And you might be more attentive to the receiving activity rather than the transmitting activity, precisely because if you are, it will destabilize the structure of the current draft's argument.

Those data are precious for companies because they reveal our preferences, our likes that can be sold to other companies to increase their sell.

 Following this discovery, we can feel frustrated because it is something we were unaware of when we started using the internet or social media since no one explained it or talked to us about it.

One could argue that it is easy to blame those companies and that it is our responsibility as individuals to react and do something. But by the time we heard about it, it was too late. We were already connected to and dependent on social media and the internet. Our personal data were already being used and sold everywhere in the world. Our discouragement grows when we think about what to do to correct that, to make things change. One idea that emerged was to make the personal information that was gathered unlawfully disappear, to be erased. One answer to that problem was for states to step in and protect our data.

Line: 30 to 44
 
Changed:
<
<

The EU is concerned with the right to be forgotten but doesn't address the real issue.

>
>

The EU is concerned with the right to be forgotten but doesn't address the real issue.

 

Line: 42 to 56
 In fact, the ECJ issued a ruling on September 25, 2019, regarding the implementation of the EU's right to be forgotten. This dispute involved Google and France's data authority (CNIL). The ruling held that CNIL can compel Google to remove links to offending material for users that are in Europe, but can't do that worldwide. We should not be shocked with the ECJ ruling, it was foreseeable since a different ruling could have been viewed as an attempt by Europe to police an US tech giant beyond US borders.
Added:
>
>
No. It upholds the right to police the US company beyond US borders. Perhaps you meant inside US borders. That is, to displace US law in the US. But the point is to deny all extraterritorial effects to EU orders coercively implementing the duty to forget or censoring indexes through deindexing orders. As I emphasized in class, SFLC.in, the Indian sister organization to my own Software Freedom Law Center, intervened in the ECJ against CNIL, from a purely non-US perspective.

 When we read through the decision, we see another underlying problem: how can right to privacy and freedom of information work together. France asked the Court to extend the right to be forgotten universally to people outside the EU. Google argued that such a ruling may result in global censorship and infringement of freedom of information rights. The real issue at stake is: can we impose on other a duty to forget?
Line: 58 to 76
 One other argument would be that allowing people to have some links related to their name delated could become a way to hide the truth and give a false picture of who they are. Imposing on people that are seeking information on other a duty to forget on some other's information is not the good answer. Individuals have a right to access all the information available, and past mistakes should not be forgotten, but used as examples. Maybe one answer to the problem would be to allow a right of correction, to reply that restricts less the freedom of expression, compares to the right to be forgotten. It would enable individuals to present themselves as they really are and correct false information about them.
Changed:
<
<
Finally, instead of using a right to be forgotten that does more harm to freedom of expression, we could use other remedies such as going to court which will decide if the information will remain available to the public society or not. We could also use mechanisms available on social media platforms that helps to identify harmful content and then can be removed by those platforms.
>
>
Finally, instead of using a right to be forgotten that does more harm to freedom of expression, we could use other remedies such as going to court which will decide if the information will remain available to the public society or not.

This is still judicial censorship, is it not?

We could also use mechanisms available on social media platforms that helps to identify harmful content and then can be removed by those platforms.

 

Added:
>
>

The draft does a good job explaining existing European legal phenomena. It also clearly voices the basic free expression arguments that limit the use of censorship orders in the interest of privacy. But the draft lacks a clear idea of your own to be placed in dialogue with these exterior sets of ideas, which is why the conclusion finds you basically throwing up your hands. Making the draft stronger means bringing your ideas to the front. You can say what you think, show how your idea emerges from your understanding of the two other points of view, then provide some real conclusion on which the reader can then base her own further thinking.

As I said in the early lines of the draft, by abstracting from the narrow "right to be forgotten" fact pattern, you can make progress. Censorship orders seem useful, whatever their problems in principle, only so far as we concern ourselves with searching: in the end, it's about the state's right to censor indexes on behalf of individuals. But if you start from the role of the same parties in surveilling what everyone reads, it is rapidly apparent that censoring indexes, or indeed censoring content directly, will not address the center of the problem at all.

Once it is apparent that the orders censoring indexes are a minor response to an infinitesimal part of the problem, one can no longer cast the issue as "right to be forgotten" against "free expression." That makes much more room for your own ideas.

 



CarlaDULACFirstEssay 2 - 11 Oct 2019 - Main.CarlaDULAC
Line: 1 to 1
 
META TOPICPARENT name="FirstEssay"

It is strongly recommended that you include your outline in the body of your essay by using the outline as section titles. The headings below are there to remind you how section and subsection titles are formatted.

Changed:
<
<

Paper Title

>
>

Privacy, Protection of our data: the right to be forgotten online versus freedom of information

 -- By CarlaDULAC - 06 Oct 2019
Changed:
<
<

Section I

>
>

My worries about having no say in the use of my personal data.

If we were to ask people from our generation if they own a smartphone, use the internet on a daily basis or have social media accounts, their answer would be yes without any hesitation. But what they and I haven't realised when using those technologies, is that we share our personal information either intentionally or unintentionally with private or public entities worldwide, that tend to violate our right to privacy by selling information without our consent to companies we haven't even heard about.

Any information related to us, in our private or professional life, that we share online, constitutes our personal data. Those data are precious for companies because they reveal our preferences, our likes that can be sold to other companies to increase their sell. Following this discovery, we can feel frustrated because it is something we were unaware of when we started using the internet or social media since no one explained it or talked to us about it.

One could argue it is easy to blame those companies and that it is our responsibility as individuals to react and do something. But by the time we heard about it, it was too late. We were already connected and dependent on social media, the internet. Our personal data were already used and sold everywhere in the world. Our discouragement gets bigger when we think about what to do to correct that, to make things change. One idea that emerged was to make the personal information that was gathered unlawfully disappear, be erased. One way to answer that problem was that states had to step in and protect our data.

The EU is concerned with the right to be forgotten but doesn't address the real issue.

 
Changed:
<
<

Subsection A

>
>
The right to be forgotten entered the European Union privacy sphere with the 2014 judgement of the ECJ involving Google. It enabled people to demand the removal of links to information that are inaccurate, inadequate, irrelevant or excessive. This is called right to be forgotten. Then, it was implemented under article 17 of the General Data Protection Regulation (GDPR). The right to erasure grants data subjects a possibility to have their personal data deleted if they don't want them used anymore and when there is no legitimate reason for a data controller to keep it. It seemed a good way to act posteriorly and find a way to correct our past mistakes. But can we really rely on it? It did not take long before we could see the limit of this regulation.
 
Added:
>
>
In fact, the ECJ issued a ruling on September 25, 2019, regarding the implementation of the EU's right to be forgotten. This dispute involved Google and France's data authority (CNIL). The ruling held that CNIL can compel Google to remove links to offending material for users that are in Europe, but can't do that worldwide. We should not be shocked with the ECJ ruling, it was foreseeable since a different ruling could have been viewed as an attempt by Europe to police an US tech giant beyond US borders.
 
Changed:
<
<

Subsub 1

>
>
When we read through the decision, we see another underlying problem: how can right to privacy and freedom of information work together. France asked the Court to extend the right to be forgotten universally to people outside the EU. Google argued that such a ruling may result in global censorship and infringement of freedom of information rights. The real issue at stake is: can we impose on other a duty to forget?
 
Deleted:
<
<

Subsection B

 
Deleted:
<
<

Subsub 1

 
Added:
>
>

What about freedom of information. What else can we do?

 
Changed:
<
<

Subsub 2

>
>
As said before, one good aspect of GDPR is that it creates a fast process to erase the data that Internet companies collect and store for use in profiling and targeted advertising. But on other side it can be used to erase online content, whether or not that content actually violates anyone else's rights. Such a use could constitute a violation of freedom of information with is part of the fundamental right of freedom of expression. Those rights are recognised in international law, as in the article 19 of the Universal Declaration of Human rights. It is true that there are some very important concerns about data protection and privacy in face of mass collection of our data by companies, but the way "right to be forgotten" is built is not appropriate.
 
Added:
>
>
First of all, freedom of expression for the public interest is essential, even for information obtained unlawfully. In fact, information may not have been accessible otherwise because it was kept secret, but for the good of the society, they have been revealed. Allowing a right to forget for those data would be harmful for the society. One other argument would be that allowing people to have some links related to their name delated could become a way to hide the truth and give a false picture of who they are. Imposing on people that are seeking information on other a duty to forget on some other's information is not the good answer. Individuals have a right to access all the information available, and past mistakes should not be forgotten, but used as examples. Maybe one answer to the problem would be to allow a right of correction, to reply that restricts less the freedom of expression, compares to the right to be forgotten. It would enable individuals to present themselves as they really are and correct false information about them.
 
Added:
>
>
Finally, instead of using a right to be forgotten that does more harm to freedom of expression, we could use other remedies such as going to court which will decide if the information will remain available to the public society or not. We could also use mechanisms available on social media platforms that helps to identify harmful content and then can be removed by those platforms.
 
Deleted:
<
<

Section II

 
Changed:
<
<

Subsection A

>
>
 
Deleted:
<
<

Subsection B

 



CarlaDULACFirstEssay 1 - 06 Oct 2019 - Main.CarlaDULAC
Line: 1 to 1
Added:
>
>
META TOPICPARENT name="FirstEssay"
It is strongly recommended that you include your outline in the body of your essay by using the outline as section titles. The headings below are there to remind you how section and subsection titles are formatted.

Paper Title

-- By CarlaDULAC - 06 Oct 2019

Section I

Subsection A

Subsub 1

Subsection B

Subsub 1

Subsub 2

Section II

Subsection A

Subsection B


