Law in the Internet Society

Algorithmic Profiling and the Future of Individual Autonomy

 -- By EricaPedersen - 11 Oct 2019
Privacy and Autonomy in the Internet Age: Beyond the Individual

 
Policymakers in the United States are finally beginning to acknowledge the tech industry’s exploitation of users through extensive monitoring and data collection. Unfortunately, the current policy debate is narrowly focused on enshrining limited individual rights to notice and consent with respect to the sale of personal data, along with a right to request deletion of personal data (see, e.g., the California Consumer Privacy Act, which also includes broad carve-outs allowing data collectors and third-party “service providers” to ignore consumers’ deletion requests altogether).
 
These trivial ‘protections’ betray legislators’ erroneous belief that privacy can be sufficiently maintained through deletion rights and businesses’ feigned attempts to dissociate personally identifiable information from non-PII data. Merely preventing one’s own data from entering a particular data pool, however, will not protect individuals from privacy intrusions derived from algorithmic profiling or from discriminatory limitations on freedom of choice. Laws that frame the harms of data exploitation so narrowly will ultimately undermine our ability to conceptualize and address the significant environmental threats that big data pose to individual and societal freedom, autonomy, and self-determination.
 
The importance of a right to privacy lies in its protection of the fundamental human rights of individual autonomy and self-determination. Preservation of these rights is integral to both freedom and functional democracy. By focusing on a superficial individual right to anonymity, modern American lawmakers obscure and insulate the broader societal harms inherent in current methods of monetizing personal data.
 

Potential Harms of Data Analytics and Algorithmic Profiling

 
Algorithms provide an economically efficient means of analyzing huge data sets to gain new insights based on complex statistical correlations, which makes them enormously powerful research tools. The Cambridge Analytica leaks revealed the potential of data analytics to wage psychological manipulation campaigns and to shape human behavior on a global scale. Moreover, algorithmic profiling has increasingly emerged as a gatekeeper controlling individuals’ ability to access a wide array of choices and opportunities in the real world.
 
Businesses seeking new ways to minimize costs have been quick to capitalize on algorithms’ predictive capabilities. Consequently, algorithms trained to maximize organizational welfare are increasingly used to infer, predict, and shape individuals’ personal preferences, interests, behavior, attitudes, movements, and health. Companies use algorithmic profiling to target and identify “ideal” job applicants, as well as to make training, compensation, promotion, and termination decisions about current employees (over whom management exercises increasingly broad and intrusive surveillance rights). Judges’ sentencing decisions are guided by insights from privately developed and largely unaccountable predictive algorithms, despite the likelihood that these tools perpetuate the same systemically discriminatory outcomes that plague our criminal justice system. Similar profiling extends to housing and policing decisions.
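To make the mechanism concrete, here is a minimal sketch, in Python with simulated data and invented feature names, of how a screening model fit to historical hiring outcomes reproduces whatever patterns those outcomes contain, including discriminatory proxies:

# Hypothetical sketch: a hiring-screen model trained on past decisions.
# All data and feature names are simulated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated applicant records: [years_experience, zip_income_decile]
X = rng.normal(size=(1000, 2))

# Past hiring outcomes that tracked the zip-code proxy, a variable that
# can encode historical discrimination, more strongly than experience.
y = (0.5 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(scale=0.5, size=1000)) > 0

model = LogisticRegression().fit(X, y)

# The fitted weights show the model leaning on the proxy: "maximizing
# organizational welfare" here simply reproduces the old pattern.
print(dict(zip(["years_experience", "zip_income_decile"],
               model.coef_[0].round(2))))

Nothing in the fitting step asks whether the historical pattern was just; the optimization faithfully encodes it.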
 
Algorithms and statistical profiling now guide (or supplant) human decision-making in a wide variety of fora, often with determinative effects on individual freedoms. Despite the significant impact these algorithms will have on our lives, and the potential for their proprietors to manipulate people en masse, very little attention is paid to the absence of accountability for these social impacts. For the most part, these processes are invisible to those whose freedoms they restrict. Affected individuals typically have no right to notice, consent, or explanation, nor any ability to effectively dispute the applicability, efficacy, or disparate impact of the methodology. Developers, empowered by trade secrets law, staunchly refuse to reveal the source code of the tools that are shaping our future.
 

The Value of Transparency

We must effectively regulate the abuse of profiling in data analytics without unnecessarily impeding technological development and the innumerable benefits to be derived from algorithmic insights. Individual autonomy, societal freedoms, and corporate accountability would be preserved far more effectively and sustainably through algorithmic transparency than through proprietary control.
 
When data privacy is conceptualized in terms of individual rights to ‘anonymity’ and deletion, personal data is framed in quasi-property terms, and the regulatory solution to preserving autonomy appears to lie in enhancing individuals’ ability to control access to their personal information. Yet even if such control could be exercised effectively, individuals would remain vulnerable to the privacy-infringing inferences of predictive algorithms and to the resulting limitations on their autonomy. Profiles will continue to be developed and used to make statistical inferences and predictions about any individual, regardless of whether that particular individual was able to prevent her own data from being collected and used to train the algorithm. These algorithms will simply be trained on data sets that are increasingly skewed, and those attempting to enforce their own data privacy may find themselves increasingly marginalized and starved of algorithmically controlled opportunities.
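A brief sketch of why opting out offers little protection (again in Python, with simulated data and invented attribute names): a model trained entirely on other people’s records still scores the person whose data was never collected.

# Hypothetical sketch: deletion rights do not prevent inference.
# All data and attribute names are simulated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Records of 10,000 people who did not exercise deletion rights:
# [age_bracket, spending_score]
X_others = rng.normal(size=(10_000, 2))
y_others = X_others @ np.array([1.0, 2.0]) > 0

model = LogisticRegression().fit(X_others, y_others)

# A person who opted out still has observable attributes, and the model
# trained without her data scores her anyway.
opted_out = np.array([[0.3, -1.2]])
print("inferred profile score:", round(model.predict_proba(opted_out)[0, 1], 2))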
 
Individual autonomy is strengthened by improving individuals’ access to information so that they can make informed decisions and contest erroneous assumptions. Regulations designed to protect individual autonomy should therefore emphasize transparency and attempt to reduce informational asymmetries. Individuals should be notified when the choices and opportunities available to them may be artificially limited by algorithmic insights. Individuals should have a right to an explanation of how the algorithm functions, including the data on which it was trained, the “target variables” it is designed to identify, and the inferences drawn from the statistical correlations that the algorithm has found. Algorithms used to guide choices and opportunities related to fundamental human rights should be continuously tested and audited for their social impacts, and they should be open to the public and freely available for study and critique by scholars and government agencies.
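What such an explanation might contain can be sketched as a machine-readable disclosure; every field name and value below is an illustrative assumption, not a statutory or industry standard.

# Hypothetical sketch of a "right to explanation" disclosure for a
# screening algorithm. All fields are invented for illustration; nothing
# here reflects an existing legal or technical standard.
disclosure = {
    "purpose": "pre-screening of job applicants",
    "target_variable": "predicted likelihood of a 'successful hire'",
    "training_data": "historical hiring records with disclosed provenance",
    "features_used": ["years_experience", "zip_income_decile"],
    "inferences_drawn": "applicants from low-income zip codes scored lower",
    "known_limitations": "zip-code income is a proxy that can encode "
                         "historically discriminatory outcomes",
    "contest_process": "contact point and deadline for disputing a score",
}

for field, value in disclosure.items():
    print(f"{field}: {value}")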
 
Transparency would facilitate refinement of algorithms to more accurately achieve their intended goals, address biases overlooked by developers, and ensure ethical data processing and applications. Transparency would also provide a means of ensuring that algorithmic insights are not used in an arbitrarily determinative manner to limit individuals’ freedoms simply because they fall on the wrong side of a statistical inference. If enhanced market efficiency is truly the goal of algorithmic decision-making, then the exacerbation of informational disparities will only lead us in the wrong direction.
 


