Informed Consent to the Use of Private Data

-- By RamShchory
 
The Problem – Software Privacy Policies Do Not Actually Provide Either Notice or Choice
 

Notice and Choice

The concept of notice and choice is the legal framework designed to assure that the user of an app or a website (together, "software") is aware of, and agrees to, the collection and use of her private information by the operator of the software. The user receives notice of these uses through a privacy policy, and chooses either directly, by a trivial click on an "I agree" button, or implicitly, by merely continuing to use the website.
 
In practice, however, neither notice nor choice is given. Privacy policies may contain the relevant information, but they are not understandable. To illustrate: by one estimate, an average user would have to dedicate a month each year to read all the lengthy privacy policies she encounters, and in fact few do. Moreover, people often mistakenly think that the very existence of a privacy policy means the operator must keep their information private. Finally, because these are complex legal documents, many users simply lack the tools to comprehend them (and sometimes they are simply written in a foreign language).
 
Analogizing to the informed consent principles of tort law: without understanding, the user is not properly informed, and her consent is an artificial agreement to an obscure or falsely perceived concept.
 

The Dangers

Seemingly, we could simply make the notice understandable. However, that alone would not provide true consent, as the user must also truly understand the potential consequences and dangers that may follow from that infringement (approved or otherwise) of her privacy.
 

Warning the User of the Possible Dangers

 
A possible solution would be to adopt warning methods used in other legal contexts.
 

Prospectus Warning

 
In the securities context, a company must declare the risk factors an investor might incur if she chooses to invest. This "prospectus" warning differs from today's privacy policies because it requires the operator to refer directly to risks. However, it seems insufficient for our purpose, as it would suffer from the same comprehension shortcomings: being too long and written in professional language.
 
 

Cigarette Warning

 
A different way is to borrow the plain packaging warning approach used to warn cigarette consumers of the product's health hazards, that is, to demand a clear, blunt, straightforward, and even frightening warning of the damages that may follow if the user agrees to the terms of the privacy policy.
 
However, this solution also suffers from major shortcomings. First, such a blunt warning might over-deter people from using economically efficient software, even if we set aside the potential damage to the software's owners and the political difficulties they may create in their opposition to this solution. Second, it is unclear how such a warning would be designed, as the nature of the potential damage is less clear than that of cigarettes. Finally, because of people's current unawareness, such a warning might be perceived as an overreaction and be taken lightly, thus achieving the very opposite of what we tried to accomplish.
 

Stepping Away From Regulation

 
It seems that any implementation of the aforementioned models would be lacking at best. An entirely different approach is to provide a technical solution, one not based on regulation.
 

Technical Solutions

 
One type of technical solution is a simple, user-oriented, and understandable review of software privacy policies by a third party. ToS;DR (Terms of Service: Didn't Read) is a great example of such a solution: it is a website rating different sites' terms of service, providing a rank on a scale of A to E and a brief summary of the most significant terms.
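To make this concrete, here is a minimal sketch of how such a rating might be represented and displayed. The data model, grades, and sample points are hypothetical illustrations, not ToS;DR's actual data or API.

```python
# A minimal sketch of a ToS;DR-style rating record. The class name,
# grades, and sample points are hypothetical, not the real ToS;DR
# data model or API.
from dataclasses import dataclass, field


@dataclass
class ServiceRating:
    service: str                      # name of the rated site or app
    grade: str                        # "A" (best) through "E" (worst)
    points: list = field(default_factory=list)  # summaries of significant terms

    def summary(self) -> str:
        lines = [f"{self.service}: Class {self.grade}"]
        lines += [f"  - {p}" for p in self.points]
        return "\n".join(lines)


# Hypothetical example of a rated service:
example = ServiceRating(
    service="ExampleSite",
    grade="D",
    points=[
        "Very broad copyright license on your content",
        "Your data may be handed to third parties",
    ],
)
print(example.summary())
```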
 
Another type of solution standardizes and simplifies the data in the privacy policy, making it accessible to the end user, with or without the intervention of the user's agent. CommonTerms is one example, offering a simple platform a software operator can adopt to let the user preview the terms of the privacy policy in an accessible and understandable way before accepting them. Privacy Icons is another example, attempting to create a set of icons that convey different privacy-related messages.
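The idea can be sketched as follows: the operator publishes its policy in a standard machine-readable format, and the user's agent translates each term into plain language (or an icon) before the user accepts. The schema and translations below are assumptions for illustration, not the actual CommonTerms or Privacy Icons formats.

```python
# A sketch of a hypothetical standardized, machine-readable privacy
# policy and a user agent that renders it in plain language before
# the user accepts. Term names and translations are assumptions.
import json

# What the operator might publish alongside its legal policy text:
published_policy = json.loads("""
{
  "data_retention": "indefinite",
  "third_party_sharing": true,
  "location_tracking": false
}
""")

# Plain-language notices a user agent might show for each (term, value):
TRANSLATIONS = {
    ("data_retention", "indefinite"): "Your data is kept forever.",
    ("third_party_sharing", True): "Your data is shared with third parties.",
    ("location_tracking", True): "Your location is tracked.",
}

for term, value in published_policy.items():
    notice = TRANSLATIONS.get((term, value))
    if notice is not None:
        print("WARNING:", notice)
```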
 

Analysis

 
The technical solutions have significant advantages. First, they paint an understandable and accessible picture of the privacy concerns. Second, they do not depend on regulation, making them uniform across the world and free from ever-changing political concerns.
 
Nevertheless, they are not free of difficulties. First, these solutions are somewhat simplistic, in the sense that they narrow vast legal issues down to simple sentences. Second, the standardizing solutions depend on the cooperation of the software operator (to display the icons or to generate the preview), who has a completely different set of interests. Third, the spread of these solutions may be slow (for example, ToS;DR has been operating for nearly three years and has fully rated only twelve sites). Finally, in their present state the technical solutions do not convey the necessary sense of urgency about the dangers posed by privacy breaches. Put differently, they might provide notice, but that does not result in fully informed users who are able to choose.
 
So what can we do? I suggest developing the technical solutions further by adding a sense of danger to the current understandable messages, that is, by making clear not only what a specific term means, but also how it can be dangerous to the user. For example, when ToS;DR says Facebook has a "very broad copyright license on your content," it could add that "this may lead to Facebook owning your creations without additional consent"; when it says Google "can use your content for all their existing and future services," it could add "including services we are unaware of today, which may be destructive to your privacy." This approach would create a simple, clear, and thought-provoking message about privacy. It may fail to lead to a revolution, but as long as people think, discuss, and debate this important issue, we are already in a better place than we are today, when everybody simply clicks "I agree."
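A minimal sketch of this suggestion follows, pairing each plain-language term with a danger annotation. The mapping is illustrative only and does not reflect ToS;DR's actual entries.

```python
# A sketch of the suggestion above: pair each plain-language term
# with a danger annotation so that notice also conveys consequences.
# The mapping is illustrative, not ToS;DR's actual entries.
DANGER_NOTES = {
    "Very broad copyright license on your content":
        "This may lead to the operator owning your creations "
        "without additional consent.",
    "Can use your content for all existing and future services":
        "Including services we are unaware of today, which may be "
        "destructive to your privacy.",
}


def warn(term: str) -> str:
    """Return the term together with its danger annotation."""
    note = DANGER_NOTES.get(term, "Danger unknown; read carefully.")
    return f"{term} -- {note}"


print(warn("Very broad copyright license on your content"))
```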
 