Law in the Internet Society

The Internet Society’s Nuclear Option

-- By AndrewIwanicki - 12 Oct 2019

Section I

In class, we have discussed the importance of privacy and the risks of surveillance in an era of increasingly sophisticated behavior recording, prediction, and manipulation. As a society, we are growing ever more entrenched in a burgeoning ecosystem of surveillance capitalism.

Many agree that a fundamental change of course is in order; the broadly unregulated, widespread capture of behavioral data should be restricted or even prohibited worldwide. Ideally, we might even eliminate all previously collected behavioral information.

However, as I reflect upon the current state of the Internet Society, I cannot ignore the nonzero possibility that the war to preserve the privacy of behavioral data and prevent sophisticated behavioral influence has already been lost.

Within Google’s servers alone lie my proudest academic works, intimate secrets from my darkest moments, my tasks for the day, my plans for the year, a scatterplot of my social footprint, an extensive record of my movements, and contact information for every human I know. Facebook, Amazon, and Bank of America hold powerful data profiles of me as well. Add to that the datasets compiled by the U.S. government and other state entities.

I write this as a relatively well-informed, well-educated, and concerned citizen. My dismal tale of ignorant surrender and subsequent inaction is all too common. Around the globe, various corporate and government entities hold massive troves of personal information about billions of people.

Section II

Unfortunately, the deletion of this behavioral data strikes me as a functional impossibility. Such valuable digital information will not be destroyed by force. Considering the power of the parties who hold it and the existential threat that deletion would present, they will not cooperate either. We must also consider the general lack of support for such action at this time and the logistical difficulties inherent in such an effort. Accordingly, I assume that the behavioral data that has been collected will remain indefinitely.

Next, I consider the possibility that we can limit the capture of behavioral data to its present state.

Even if I completely unplug today, I have already leaked extensive information. The power of this data in combination with present-day tools is evident in societal changes as fundamental as declining sex drive and the swaying of national elections.

With such immense value, behavioral-data-driven tools will continue to advance even in the absence of new data collection.

The best-case scenario appears to be an incremental slowdown of behavioral data collection over several years, met with significant dissent from parties that are unmoved by widespread concern and have sufficient leverage to withstand external pressure (e.g., the Communist Party of China).

Section III

Considering these dynamics, I am concerned that a data-collection slowdown may be insufficient to eliminate threats of social control. Accordingly, it seems prudent to consider an alternate plan of action in case of continued progression into a surveillance-centric ecosystem.

Society’s current path is one in which the Parasite with the Mind of God is under construction… or simply undergoing perpetual renovations. Theorists such as Ray Kurzweil and Nick Bostrom believe that society is en route to creating superintelligent artificial intelligence, a digital system capable of outperforming humanity in all intellectual endeavors. Such a machine strikes me as the natural conclusion of a society caught in a feedback loop of data capture for observation, analysis, and influence.

Bostrom further claims that superintelligent A.I. “is the last invention that man need ever make” as it may execute any further self-enhancements and will be sufficiently intelligent to thwart attempts at intervention.

If we continue on this path, we must decide who should be in control of this ultimate project and what procedures will guide the decision-making process.

At present, the frontrunners in the race for big data and sophisticated machine learning seem to be Big Tech and national governments. Neither group embodies the goals or procedures that I want guiding such a project of ultimate importance.

Both are shrouded in secrecy and operate within competitive spaces that cultivate racing behavior. “Move fast and break things.” “It’s better to ask for forgiveness than to request permission.” As these tools become more powerful and their societal impact more drastic, such behavior becomes increasingly dangerous.

To avoid a future shaped by today’s likely candidates and their inherent flaws, I advocate the establishment of a socialized multinational AI research project that is subject to public input and oversight and is less constrained by capitalist and political forces. A unified global public project strikes me as the best opportunity to cultivate sufficient resources to surpass the efforts of Big Tech and national governments.

Even if such a project is initiated immediately, the hour is late and the competition is fierce. Thus, drastic action must be considered. Legislation granting data portability rights could be extremely helpful, allowing individuals to obtain their personal data from service providers and, in turn, share that information with the socialized project. Similarly, legislation that protects adversarial interoperability in the software industry could catalyze transitions away from predatory products upon which the public has become dependent. If necessary to achieve competitive dominance, further data collection on a consensual basis may be pursued.

Section IV

While the collection and processing of behavioral information is inherently risky, an international socialized model may greatly reduce the risks posed by our present private and national models.

I do not advocate any surrender in the fight for privacy. I simply support the development of contingency plans. An arms race is afoot in both the private and public sectors, with many convinced that surveillance is the key to future dominance. In humanity’s failure to denuclearize, I see modern society’s inability to relinquish powerful tools of control, and I fear that digital surveillance may be similarly destined to proliferate.

You might consider some more descriptive section titles.

The "final invention" "singularity" hype benefits from apparent inevitability: even though we don't understand how creative intelligence works, our sham versions based on pattern matching are getting better and better. We can now train a computer to play go so well that we think we don't need to concern ourselves with the fact that our go-playing program doesn't know that go is a game or that it is playing. If super-intelligence can be made this dumb, we can make it eventually. If not, we are as far from even being able to begin as we were fifty years ago when the smartest people in the world started trying to sell this stuff and I began my lifelong refusal to believe a word of it.

But what the hype has done is convince national leaders around the world. So the geopolitics of the situation now render your idea of multinational socialization of behavior data as impossible as (you rightly say) the destruction of existing data is. Your nuclear solution is indeed caught up in a similar arms race, and is similarly out of reach. So you have no Plan B to go with the absence of Plan A.

A decision then looms about the next draft. You can pursue the idea you are on, and enter into how the geopolitics are to be reformed so that such cooperation is possible (Atoms for Peace, perchance?). Or you can return to the problem of what to do about surveillance capitalism if the existing stocks of capital are not to be destroyed. Here there is a real question, doubtless. We'd have to make sure that the next generation was differently taught, and was less pillaged, so that pollution doesn't become hereditary. At present, of course, we are doing the reverse. We'd have to institute competition in the market for services, offering low-cost ways of delivering similar services to people without the spying, allowing market forces to reduce the value of the behavior collection system while offering more ways for existing people to leave and new people never to tumble in. I have some ideas about that, and you could offer better ones. At least that would not involve the deus ex machina of super-intelligence without self-knowledge.

