Computers, Privacy & the Constitution

Fluoride and Facebook: Comparing Privacy Regulation to Environmental Regulation, and How a Better Standard Can Be Created

-- By AnthonyMahmud - 26 Apr 2021

Introduction

On multiple occasions this semester, class discussion has focused on why consent requirements alone are not sufficient to protect the privacy interests of a population with widely varying understandings of personal data and how to manage it. As with the regulation of air, roads, or water, data privacy regulation should establish baseline standards of safety, even if some segments of the population would opt out given the choice. Society should ensure that its members can breathe fresh air and drink clean water regardless of their awareness of contaminants or their capacity to purchase filtration systems.

However, identifying the appropriate level of protection is not always simple: any single standard is likely to underregulate in some circumstances and overregulate in others. The tradeoff is harder to balance when the uses of a given utility are more varied. For example, the water that runs through residential plumbing contains fluoride and is regulated for human consumption. The same water, when used to flush toilets, is cleaner and perhaps more expensive than society needs it to be. At the other end, tap water is ill-suited for certain niche uses, ranging from CPAP machines and first aid to automobiles and aquariums. These inefficiencies and limits of quality are not necessarily flaws in policy, but necessary compromises of a single-source distribution infrastructure.

The Context-Based Nature of Digital Privacy Harms

Digital privacy regulation does not share this single-source constraint, so a one-size-fits-all approach is unnecessary, even though certain minimum standards still need to be concretely established. The challenge in establishing those standards is that the harmfulness of a given data collection practice depends heavily on the context in which it occurs. Access to a device's location data, camera, or microphone is among the most intrusive of privacy permissions, but those inputs are necessary when the device is used for navigation or video calling. Conversely, predatory applications often require sensitive information even though those inputs play no role in the service being provided. Games and mobile apps targeting children are notorious for needlessly demanding location data and extensive behavioral monitoring.

An effective privacy regulation would allow navigation software to use necessary location data but prohibit nonessential uses that merely seek to profit from mined data. GDPR Article 7 contains language that demonstrates how such an effect might be achieved. Article 7 sets out conditions for consent, and Article 7(4) stipulates that “when assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.” This provision reflects skepticism that a person, if properly informed, would consent to provide data that is not needed for primary functionality. It also appears to work against privacy policies that bundle essential and nonessential data-collecting practices into a single, inseparable consent requirement. The GDPR applies this test only to the assessment of consent, but the underlying necessity principle can be applied to baseline standards of permissibility.
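To make that mechanism concrete, here is a minimal sketch of the necessity test expressed as a baseline rule rather than a consent condition. The service types, data categories, and function names are illustrative assumptions of mine, not anything drawn from the GDPR or an existing system:

```python
# Hypothetical sketch of an Article 7(4)-style necessity test applied as a
# baseline rule. REQUIRED_FOR and collection_exceeds_necessity are assumed
# names for illustration, not a real API.

# Data categories each kind of service plausibly needs for its core
# functionality (assumed values).
REQUIRED_FOR = {
    "navigation": {"precise_location"},
    "video_call": {"camera", "microphone"},
    "childrens_game": set(),  # core gameplay needs no sensitive data
}

def collection_exceeds_necessity(functionality, requested):
    """Return the data categories a service demands beyond what its
    declared functionality requires; a non-empty result flags the
    bundled, take-it-or-leave-it collection the provision targets."""
    necessary = REQUIRED_FOR.get(functionality, set())
    return requested - necessary

# A navigation app asking for precise location passes...
print(collection_exceeds_necessity("navigation", {"precise_location"}))  # set()
# ...but a children's game demanding it (plus contacts) would be flagged.
print(collection_exceeds_necessity("childrens_game", {"precise_location", "contacts"}))
```

The point of the sketch is that the question shifts from "did the user click agree?" to "does the declared functionality actually need this input?", which can be asked without any reference to consent.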

Attempting to Categorize and Regulate Harms through an Environmental Framework

An environmental-style regulation of digital privacy might break collection practices into three categories: the globally permissible, the globally impermissible, and those which (drawing from the GDPR language) must be narrowly tailored to the functionality provided to the user. The globally impermissible category would capture conduct akin to the compounds and concentrations prohibited in the regulation of water or air: conduct that should be barred even if select legitimate purposes can be conceived. Facial recognition might fit here, as its capacity for abuse almost always outweighs its practical benefit. Globally permissible conduct might be something like the fluoride treatment of public water supplies. Collection of imprecise location data (such as one's country or time zone) has many legitimate applications and relatively low capacity for abuse. In the situations where a person needs complete anonymity (where they need distilled water), they are more likely to recognize that need themselves and take the initiative to exercise extra caution.

Lastly, when collection practices have both significant capacity for abuse and practical utility, the burden would shift to the digital service provider to take only what it needs to execute the functionality being provided. This, of course, necessarily eliminates data collection practices that are essential to funding or developing the service. Precise location data, data from camera or microphone access, and cursor-tracking logs could not be collected in order to sell targeted advertisements or optimize user interfaces. While this would hamper some good-faith research and undermine the financial viability of certain service offerings, it reflects the importance of protecting everyone's fundamental privacy interests.
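The three-tier scheme might be pictured in code as follows; the tier assignments and all names below are illustrative assumptions about how a regulator could classify practices, not established designations:

```python
from enum import Enum

# Hypothetical sketch of the three-tier scheme described above.
class Tier(Enum):
    GLOBALLY_IMPERMISSIBLE = 1  # barred outright, like prohibited contaminants
    GLOBALLY_PERMISSIBLE = 2    # allowed everywhere, like fluoridation
    NECESSITY_SCRUTINY = 3      # allowed only as needed for core functionality

# Assumed classifications, tracking the examples in the essay.
PRACTICE_TIER = {
    "facial_recognition": Tier.GLOBALLY_IMPERMISSIBLE,
    "coarse_location": Tier.GLOBALLY_PERMISSIBLE,  # country / time zone
    "precise_location": Tier.NECESSITY_SCRUTINY,
    "camera_microphone": Tier.NECESSITY_SCRUTINY,
    "cursor_tracking": Tier.NECESSITY_SCRUTINY,
}

def may_collect(practice, needed_for_core_functionality):
    """Apply the three-tier rule: middle-tier practices shift the burden
    to the provider to show the data serves the service itself."""
    tier = PRACTICE_TIER[practice]
    if tier is Tier.GLOBALLY_IMPERMISSIBLE:
        return False
    if tier is Tier.GLOBALLY_PERMISSIBLE:
        return True
    return needed_for_core_functionality

print(may_collect("precise_location", needed_for_core_functionality=True))   # True: navigation
print(may_collect("precise_location", needed_for_core_functionality=False))  # False: ad targeting
```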

Concerns and Other Considerations

For a system like this to be effective, the regulation would need clear boundaries for when a practice is necessary to execute functionality, and a strong enforcement mechanism for punishing nonessential collection. Otherwise, digital services have an incentive to add ancillary functionalities solely to justify broader data collection. It is difficult to prevent this behavior without placing more conduct in the 'globally impermissible' category or relying on consent agreements. One solution might be to limit the use of collected data to the narrow circumstances under which it could permissibly be collected, as sketched below. Selling advertisements would not conceivably relate to executing the functionality of a service, so data collected for a legitimate reason could not be repurposed to that end.
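One way to picture that purpose limitation is as a tag attached to data at the moment of collection and checked at every subsequent use. This is a minimal sketch under assumed names (Record, use_allowed), not a description of any existing enforcement system:

```python
from dataclasses import dataclass

# Hypothetical sketch of purpose binding: each record carries the narrow
# purpose that justified its collection, and every proposed use is checked
# against that tag.
@dataclass(frozen=True)
class Record:
    field: str
    collected_for: str  # the purpose that made collection permissible

def use_allowed(record, proposed_use):
    """Permit a use only if it matches the purpose of collection."""
    return proposed_use == record.collected_for

gps_fix = Record(field="precise_location", collected_for="turn_by_turn_navigation")
print(use_allowed(gps_fix, "turn_by_turn_navigation"))  # True
print(use_allowed(gps_fix, "targeted_advertising"))     # False: cannot be repurposed
```

Under this design, adding an ancillary feature buys the provider nothing beyond that feature: the data it justifies collecting remains unusable for advertising or any other unrelated purpose.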

Regulating digital privacy with an 'environmental' approach is undoubtedly a challenging balancing act. However, erring on the side of protection shifts the responsibility onto the parties better equipped to handle it (service providers) and reinforces the status of individual privacy rights as fundamental.

