Computers, Privacy & the Constitution

AnthonyFikryFirstPaper 20 - 27 Apr 2024 - Main.EbenMoglen
Line: 1 to 1
 
META TOPICPARENT name="FirstPaper"
Changed:
<
<
A Comparative Examination of Data Privacy Laws in the United States and the European Union, with Particular Reference to the General Data Protection Regulation
>
>

A Comparative Examination of Data Privacy Laws in the United States and the European Union, with Particular Reference to the General Data Protection Regulation

In an age in which personal data has become a valuable commodity for advertisers and organizations at large, the need for robust data privacy legislation has emerged as a defining issue of our time. The General Data Protection Regulation (GDPR), which took effect across the European Union (EU) in 2018, represents a comprehensive framework aimed at safeguarding individuals' privacy rights in the digital world. In contradistinction to the EU, the United States lacks a comprehensive data privacy law applicable to all types of data and domestic companies. Instead, data privacy laws in the U.S. tend to be fragmentary, with various state-level regulations governing different sectors and types of data. In the remarks that follow, I shall provide an overview of data privacy laws in the U.S. and the EU, and then examine whether the U.S. could adopt the GDPR or a similar federal statute to address the shortcomings of its current data privacy regime.
Line: 15 to 15
The question arises as to whether the U.S. would stand to gain from the adoption of a GDPR-like statute. Proponents may argue that the need for such a statute is even more pronounced in America, since most Big Tech companies are American. Others may counter that a statute akin to the GDPR would fare poorly in the U.S. because the country is simply too different from the EU in its institutions and values. A statute such as the GDPR is arguably at variance with America's capitalistic ethos. The GDPR’s stringent requirements and compliance costs, coupled with potential fines for noncompliance, constitute market distortions. As such, so the argument goes, they may prove inimical to the free-market competition that lies at the heart of America’s capitalistic economic system. Moreover, small and medium-sized businesses would bear the brunt of such costs, raising barriers to entry and arguably stifling the competition and innovation on which America places so high a premium.

As data breaches and privacy concerns continue to mount, the U.S. faces increasing pressure to strengthen its data privacy regulations. While the adoption of a GDPR-like statute presents challenges owing to cultural, legal, and economic disparities, it also offers an opportunity to enhance individual rights and bolster consumer trust in the digital marketplace. By navigating the complexities inherent in adopting robust data privacy legislation such as the GDPR, the U.S. can work towards establishing itself as a global leader in safeguarding data privacy rights.

Added:
>
>

This draft exaggerates the social difference between opt-in and opt-out "data protection." It does not actually inquire into the regimes regarding state listening (which all GDPR implementing national legislation leaves unaffected). It does not discuss the actual "safe-harboring" approach to the global status of supposedly GDPR-covered personal information. It could afford to lose 150 words of introductory or otherwise unnecessary explication in order to have room to deal with these and similar further considerations.

META TOPICMOVED by="EbenMoglen" date="1714246496" from="CompPrivConst.TWikiGuestFirstPaper" to="CompPrivConst.AnthonyFikryFirstPaper"

AnthonyFikryFirstPaper 19 - 06 Apr 2024 - Main.AnthonyFikry
Line: 1 to 1
 
META TOPICPARENT name="FirstPaper"
Changed:
<
<

Evasive Maneuvers -- How the Privacy Regulations Can Cover Government Actors in the Future

>
>
A Comparative Examination of Data Privacy Laws in the United States and the European Union, with Particular Reference to the General Data Protection Regulation
 
Changed:
<
<
-- By Morgan Carter - 05 Mar 2024
>
>
In an age in which personal data has become a valuable commodity for advertisers and organizations at large, the need for robust data privacy legislation has emerged as a defining issue of our time. The General Data Protection Regulation (GDPR), which took effect across the European Union (EU) in 2018, represents a comprehensive framework aimed at safeguarding individuals' privacy rights in the digital world. In contradistinction to the EU, the United States lacks a comprehensive data privacy law applicable to all types of data and domestic companies. Instead, data privacy laws in the U.S. tend to be fragmentary, with various state-level regulations governing different sectors and types of data. In the remarks that follow, I shall provide an overview of data privacy laws in the U.S. and the EU, and then examine whether the U.S. could adopt the GDPR or a similar federal statute to address the shortcomings of its current data privacy regime.
 
Changed:
<
<
The California Consumer Privacy Act (CCPA) and other data privacy regulations across the nation were passed with a clear goal in mind: to protect consumers from the extensive sharing and selling of their data by companies profiting from that personal, and often private, information without consumer knowledge or consent. Similar legislation followed in states like Virginia, Colorado, Utah, and Connecticut. While the various consumer data regulations differ in their details, they are consistently regarded as progressive steps forward, intended to provide transparency and protection for consumers. In the years since the passage of the CCPA (since amended by the California Privacy Rights Act (CPRA)), businesses have worked to comply with the obligations imposed, and more legislation aiming to protect consumer data will likely emerge in the coming years.

But the implementation of these state privacy regulations has left room for a far more dangerous and pervasive form of collection, sharing, and sale of consumer data. It was recently revealed that the United States government has been “buying up reams of consumer data — information scraped from cellphones, social media profiles, internet ad exchanges and other open sources — and deploying it for often-clandestine purposes like law enforcement and national security in the U.S. and abroad.” This includes the entire digital footprint of any American citizen: “[t]he places you go, the websites you visit, the opinions you post — all collected and legally sold to federal agencies.” Id. There is considerable danger, and justified discomfort, in the knowledge that the United States government is quietly purchasing and collecting consumer data from companies. This data can be, and is, “used for everything from rounding up undocumented immigrants or detecting border tunnels. We’ve also seen data used for man hunting or identifying specific people in the vicinity of crimes or known criminal activity.” See also. We risk becoming an even bigger surveillance state than we already are as the government purchases consumer data, and many of those risks are even higher for minority populations. [see also Carter, forthcoming Columbia Law Review, March 2024].

While consumers may be protected from (some of) the predatory sharing and sale practices of for-profit businesses thanks to the existing privacy regulatory framework, the United States government has found a way to access this information without being subject to the requirements of the data privacy regulations specifically designed to prevent this kind of collection. This was likely accomplished in a few ways. First, the privacy regulations apply only to companies that meet certain criteria, and the government and its contractors were probably doing business with companies that fall outside those criteria. See id.

For example, the CPRA applies “to any for-profit organization, which may do business in the State of California,” (emphasis omitted) and “applies to businesses that: [1] Have a gross annual revenue of over $25 million in the preceding calendar year, or [2] Buy, receive, or sell the personal information of 100,000 or more California residents, households, or devices, or [3] Derive 50% or more of their annual revenue from selling or sharing California residents' personal information[.]” To avoid the companies that would be subject to regulations like the CPRA, government agencies and their associates need only deal with businesses falling just outside these parameters. For instance, the government gleans consumer data from “tiny, obscure data brokers,” with “very little public-facing presence and almost no direct consumer relationship. Some of these companies focus on consumer data. Some focus on social data. Some focus on movement data.”

The second way the government gets around the privacy regulations is by taking advantage of how the “opt-out” mechanism works. These regulations contain “opt-out” provisions requiring qualifying businesses to offer consumers the option to opt out of the sharing of their data, or to see to whom their data is sold. In the European Union, the General Data Protection Regulation (GDPR) uses an “opt-in” format that requires consumers to affirmatively choose to allow their data to be collected and shared by the pages they visit. In the United States, it is the exact opposite: more data is collected because inclusion in data collection is the default. Companies that fall outside the privacy regulations can exploit this default, collect large swaths of consumer data, and profit from selling it to third parties such as the government and its contractors.

For current and upcoming consumer data privacy regulations to better achieve their goal of protecting consumer data and increasing transparency about where that data goes, future amendments and regulations should adjust the criteria for qualification under the privacy regulations and should work to move the nation to an opt-in system. On the first point, the current revenue threshold should be lowered so that more businesses are covered. Considering that the government is currently buying data from “tiny” data brokers, a smaller revenue requirement could help capture some of these parties. Alternatively, the number of residents whose data is bought, received, or sold could be reduced for the same purpose: covering for-profit businesses that receive the personal information of even 10,000 residents, instead of the current 100,000, would help reduce the degree to which the government can quietly access this information. Moving to an opt-in system would, at the very least, reduce the number of people whose consumer data and private information is caught up in the data-sharing and sale market. Requiring a conscious choice to opt into data sharing would reduce the number of people who do so, and agency could be returned to large swaths of consumers. The government has been taking advantage of the data privacy regulatory framework as it stands today, but there is still time to change it and avoid an even stronger surveillance state.
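To make the threshold point concrete, here is a minimal Python sketch of the three CPRA applicability criteria quoted above. Only the numeric cutoffs come from the quoted statutory text; the function name, parameters, and the sample broker figures are invented for illustration and carry no legal weight.

    # Minimal sketch of the CPRA applicability thresholds quoted above.
    # Only the numeric cutoffs come from the quoted criteria; the names and
    # sample figures are hypothetical.

    def cpra_likely_applies(gross_annual_revenue_usd: float,
                            ca_records_handled: int,
                            revenue_share_from_selling_data: float) -> bool:
        """True if any one of the three quoted criteria is met."""
        return (
            gross_annual_revenue_usd > 25_000_000        # criterion [1]
            or ca_records_handled >= 100_000             # criterion [2]
            or revenue_share_from_selling_data >= 0.50   # criterion [3]
        )

    # A "tiny, obscure" data broker can sit just under every threshold and
    # still sell personal information to government buyers:
    print(cpra_likely_applies(24_000_000, 99_999, 0.49))   # False
    # Lowering criterion [2] to 10,000 records, as proposed above, would
    # capture the same broker:
    print(99_999 >= 10_000)                                # True

The sketch only illustrates that the evasion described above is a matter of sitting below bright-line numbers, which is why adjusting those numbers is the natural fix.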
>
>
The GDPR is the most detailed and rigorous data privacy regulatory regime in the world, applying even to entities outside the EU provided that they process the personal data of individuals in the EU. The GDPR governs the acquisition, management, and processing of personal data, and makes the consent of data subjects a key requirement. Pursuant to the GDPR, companies may collect data on individuals in the EU only with their explicit, informed consent, and they must explain to them in simple terms how their data is being used. The GDPR additionally affords data subjects the right to request copies of their data and to request its permanent deletion. Failure to comply with the GDPR may result in fines of up to the higher of €20 million or 4% of global annual revenue. The adoption and enforcement of the GDPR testifies to the premium placed on data privacy within the EU.
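The fine ceiling is simple arithmetic: the greater of a flat €20 million or 4% of global annual revenue. The short Python illustration below merely restates that rule; the revenue figures are invented, and the function is not a model of how regulators actually calculate penalties.

    def gdpr_max_fine_eur(global_annual_revenue_eur: float) -> float:
        """Upper bound of the GDPR's top fine tier: the higher of EUR 20M or 4% of revenue."""
        return max(20_000_000.0, 0.04 * global_annual_revenue_eur)

    print(gdpr_max_fine_eur(100_000_000))    # 20000000.0 -- the flat floor dominates
    print(gdpr_max_fine_eur(2_000_000_000))  # 80000000.0 -- the 4% term dominates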

Although a GDPR-like statute has not been adopted at the federal level in the U.S., many states, inspired by the GDPR, have enacted data privacy laws, such as the California Consumer Privacy Act and the Virginia Consumer Data Protection Act. However, the GDPR applies to a wider range of data, such as cookie data, location information, and IP addresses, whereas data privacy laws on this side of the Atlantic protect, in the main, the health and financial information of data subjects.

There are several key constitutional and cultural considerations that may shed light on the different approaches to data privacy regulation adopted by the U.S. and the EU. The EU Charter of Fundamental Rights, for instance, enshrines both respect for private and family life and the protection of personal data as fundamental rights. No equivalent provision appears explicitly in the U.S. Constitution, although some have viewed the Fourth Amendment as providing a basis for a right to data privacy; the substantive due process doctrine derived from the Fifth and Fourteenth Amendments could also serve as such a basis, but its future remains uncertain in the aftermath of the Dobbs decision. The EU's commitment to privacy has profound historical roots, stemming in part from the abuse of individuals' privacy in the twentieth century, particularly under fascist and communist regimes.

A deeper explanation for the discrepancy lies in divergent approaches to the power and scope of government. In the U.S., the default legal order is characterized by the absence of regulation, and a greater premium is placed on constraining the power and scope of government, including the federal government. The EU, by contrast, has tended to favor government intervention to a far greater degree, as reflected in its extensive social safety nets, whereas the U.S. has traditionally taken a more laissez-faire approach that tends to favor companies that collect and use personal data. In the U.S., there is greater scope for commercial use of personal data, even at the expense of privacy rights. Recent years have seen public opinion shift gradually towards supporting better protection of personal data as privacy violations continue to come to the fore, but the underlying cultural differences discussed above continue to impede bringing U.S. data privacy laws into alignment with the GDPR.

The question arises as to whether the U.S. would stand to gain from the adoption of a GDPR-like statute. Proponents may argue that the need for such a statute is even more pronounced in America, since most Big Tech companies are American. Others may counter that a statute akin to the GDPR would fare poorly in the U.S. because the country is simply too different from the EU in its institutions and values. A statute such as the GDPR is arguably at variance with America's capitalistic ethos. The GDPR’s stringent requirements and compliance costs, coupled with potential fines for noncompliance, constitute market distortions. As such, so the argument goes, they may prove inimical to the free-market competition that lies at the heart of America’s capitalistic economic system. Moreover, small and medium-sized businesses would bear the brunt of such costs, raising barriers to entry and arguably stifling the competition and innovation on which America places so high a premium.

As data breaches and privacy concerns continue to mount, the U.S. faces increasing pressure to strengthen its data privacy regulations. While the adoption of a GDPR-like statute presents challenges owing to cultural, legal, and economic disparities, it also offers an opportunity to enhance individual rights and bolster consumer trust in the digital marketplace. By navigating the complexities inherent in adopting robust data privacy legislation such as the GDPR, the U.S. can work towards establishing itself as a global leader in safeguarding data privacy rights.


AnthonyFikryFirstPaper 18 - 05 Mar 2024 - Main.MorganC
Line: 1 to 1
 
META TOPICPARENT name="FirstPaper"
Changed:
<
<

Lovecraftian Corporations

>
>

Evasive Maneuvers -- How the Privacy Regulations Can Cover Government Actors in the Future

 
Changed:
<
<
-- By Anderson Dalmeus - 01 Mar 2024
>
>
-- By Morgan Carter - 05 Mar 2024
 
Deleted:
<
<
Corporations have become nearly ubiquitous in the everyday life of Americans. They are understood through their branding, their products and services, and the people who work there. The corporate entity can be understood as a kind of legal fiction ordained by the bureaucracy of the state. However, we might also consider taking the phrase “corporate entity” in a very literal sense. That is, when we say that Google is a corporate entity, we do not merely mean that this is shorthand for the culmination of disparate processes and projects that form Google, but rather that Google does in fact take on a life of its own through the process of incorporation. This may seem like a tortured conclusion, and even now as I write it I can feel myself stretching the phrase “take on a life of its own,” but it isn’t an unprecedented interpretation either. This thinking should really be considered an extension of the reasoning that led the Supreme Court to its decisions in Santa Clara County v. Southern Pacific Railroad and Citizens United v. FEC. Corporations law is designed to manifest the capital-C “Corporation” as a legal thing distinct from its property and its labor, and these cases tell us that corporations also have First Amendment rights to free speech and Fourteenth Amendment protections against the state.

However, I must break with the Supreme Court on one small detail of its characterization of corporate entities. Rather than viewing them as persons or as having personhood, they should be viewed the way one views the abominations of a Lovecraftian horror. They are creatures that exist abstractly, and even to begin to perceive them in their true forms fundamentally alters the mind away from the natural reasoning of a human. Their machinations are unknowable in their entirety, and their goals are not always anthropomorphic. One might consider that the shareholders or the board of a corporation are ultimately responsible for its decisions, but that is like saying the neurons in a person’s brain are responsible for their decisions: doubtless true, yet it reveals little about the actual person. The same way the cells of the body come together to form a whole person without ever arguably being able to engage with or understand the person they form, the workers and owners of a corporation form an entity that they will never be able to engage with directly. That is why the law imposes fiduciary duties on board members: the corporation cannot speak to any individual piece that makes it up, any more than I can talk directly to my own cells, and these duties prevent the entity’s constituents from becoming cancerous to it.

While it may still seem farfetched to claim that a corporation literally exists as a result of the law, is that more farfetched than selling your time to that corporation? If the corporation is not literally real, then where does someone’s time go when they spend all of those hours working for it? Obviously not to that person, because then they would be working for themselves. That person feeds their time, their energy, their hopes, their dreams, and nearly everything they hold dear to that corporation, and yet the corporation supposedly does not exist. What is to be said of a world where a person can give all that they are into nothing? Certainly we would call this madness. But even then we are simply led back to the Lovecraftian description of the corporation and its impact on those who engage with it.

One pours one’s blood, sweat, and tears into the void, and while it would be emotionally satisfying to say that the void isn’t there, the apparent absence of all the effort poured in is evidence that something is there. If that void were not there to speak of, then all that was poured into it would still be there. Likewise, it is evident that corporations do exist, because something seems to be draining humanity of all its energy and productivity. Something is eating the planet’s resources at a rate that no mere organism could. One might say humans could do that, but, as stated before, humans are to corporations as cells are to the human. Individual cells could not do what the entire human can, and you could have as many cells in the human body as you like, but they all move only when the greater human acts intentionally.

This means that corporations, those wretched conglomerations of human activity and productivity, are inevitable. The horrific conclusion of a Lovecraftian story is that the eldritch creatures pulling the strings of society are unavoidable and cannot be defeated by humanity. Through the Lovecraftian lens, this is because they are like natural disasters: devastating though they may be, they are simply a part of life beyond the control of mankind. But I will take it a step further still. The corporate entity is not merely a force of nature that mankind must contend with; rather, it is the human nature of any group that grows large enough to have that complexity. The corporate creature awakens at some point after enough human mass and activity comes together, and it begins to consume all that there is to consume. The only choice at the individual level for humans is whether to be part of the entity or to go off the grid.

AnthonyFikryFirstPaper 17 - 03 Mar 2024 - Main.AvrahamTsikhanovski
Line: 1 to 1
 
META TOPICPARENT name="FirstPaper"

Lovecraftian Corporations


AnthonyFikryFirstPaper 16 - 01 Mar 2024 - Main.AndersonDalmeus
Line: 1 to 1
 
META TOPICPARENT name="FirstPaper"
Changed:
<
<

High in the sky: cloud gaming and privacy

>
>

Lovecraftian Corporations

 
Changed:
<
<
-- By RobertLarese - 01 Mar 2024
>
>
-- By Anderson Dalmeus - 01 Mar 2024
 
Deleted:
<
<
Computer gaming - to be deliberately cavalier, this includes gaming on a PC or a console, or even your smart phone - is something of a binary. All forms may be played on a third-party server or on a local machine. Consider first the second means. To enjoy the thing to be enjoyed, another set of third parties typically provides access to group playing, so-called "going online," being "online" or, simply, "on," by connecting local machines together over the internet. Microsoft is one of these other third parties. Bill Gates is happy to sell you the local machine and, for an allegedly paltry ten dollars a month, provide the connectivity to other local machines. See https://www.xbox.com/en-US/live/gold. But this type of gaming consists of at least personal hardware running the game locally, should the "gamer" forgo multiplayer. The twenty-first century heralded a competitor to this local-machine-plus-internet model.

Enter cloud gaming.

Imagine an end user, the player; a pipe relaying the user's inputs to a remote machine - it could be a server farm or another individual's personal computer - where the game is run; and a pipe that transfers outputs back to the user, over an internet connection strong enough to handle whatever absurdly sharp rendering the game requires for enjoyment. At its core, "[c]loud gaming . . . renders an interactive gaming application remotely in the cloud and streams the scenes as a video sequence back to the player over the Internet." https://ieeexplore.ieee.org/document/6574660.
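In code, the quoted definition reduces to a simple loop: inputs go upstream, encoded frames come back downstream. The following Python sketch is purely illustrative; the class and function names are invented placeholders, not any vendor's actual streaming API.

    import time

    class RemoteGameSession:
        """Hypothetical stand-in for a connection to a cloud rendering server."""
        def send_input(self, event: dict) -> None: ...         # upstream: player input
        def receive_frame(self) -> bytes: return b""            # downstream: encoded video frame

    def client_loop(session: RemoteGameSession, read_input, display,
                    fps: int = 60, max_frames: int = 3600) -> None:
        """Forward inputs and display returned frames at a target frame rate."""
        frame_interval = 1.0 / fps
        for _ in range(max_frames):
            start = time.monotonic()
            session.send_input(read_input())     # everything heavy happens remotely
            display(session.receive_frame())     # the local device only decodes video
            time.sleep(max(0.0, frame_interval - (time.monotonic() - start)))

The delay across the two calls in the loop body is the "interaction latency" discussed below, and the fidelity of the returned frames is the "streaming quality."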

I suppose the obvious question is whether Microsoft and others will move away from the personal hardware component. Netflix did. This paper does not concern that strategic decision. Netflix knows every button you press: They "mobilized the [cursor] and sent it into battle." https://www.churchillbookcollector.com/pages/books/006737/winston-s-churchill/mr-churchills-speech-in-the-house-of-commons-2nd-of-august-1944. Suppose we continue to consent, through apathy or willful blindness or sheer ignorance, to the counter-privacy model of internet consumerism. How then should we feel about cloud gaming?

Is this another tidal wave?

The appeal of cloud gaming is instant: it offers users high-powered computing without the limits of personal hardware. It has another appeal as well: accessibility. By "fall 2020, cloud gaming services [were still] largely unavailable to those outside North America and Central Europe." https://project-paladin.org/. Cloud gaming is not without implementation issues: both "interaction latency" with respect to inputs and "streaming quality" with respect to outputs remain challenges. https://ieeexplore.ieee.org/document/6574660. If these challenges are overcome, users could enjoy high-quality gaming without the upfront cost of a personal machine. This could also doom the less privileged to a subscription model, conscripting them into a system where even their own accolades are not their own.

Assume away the subjugation of these acolytes for a moment. This may be a more reasonable assumption than it appears. Researchers at the University of Michigan in 2020 developed a method of cloud gaming, leveraging Moonlight and Google Cloud Platform (GCP), that delivered computer gaming to users at "approximately 60¢/hour when in use," after "approximately 500 hours" of free cloud gaming during the first three months of use. https://johnragone.medium.com/500-hours-of-free-4k-60-fps-cloud-gaming-with-gcp-and-moonlight-c796fa10f0a3.
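As back-of-the-envelope arithmetic, the cited figures imply a simple cost curve: nothing until the free hours run out, then roughly $0.60 per additional hour. The sketch below just restates that arithmetic; the numbers are taken from the write-up cited above and the function itself is hypothetical.

    def diy_cloud_gaming_cost_usd(hours_played: float,
                                  free_hours: float = 500.0,
                                  rate_per_hour: float = 0.60) -> float:
        """Estimated spend under the quoted GCP/Moonlight setup."""
        return max(0.0, hours_played - free_hours) * rate_per_hour

    print(diy_cloud_gaming_cost_usd(400))   # 0.0   -- still inside the free window
    print(diy_cloud_gaming_cost_usd(700))   # 120.0 -- 200 billable hours at $0.60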

There is still a problem. Cloud gaming allows "less powerful computational devices that are otherwise incapable of running high-quality games," e.g., a simple computer or even a smart phone, to run such "high-quality games." https://ieeexplore.ieee.org/document/6574660. Just like Netflix, GCP, NVIDIA "GeForce Now," or whoever else is providing the more powerful "computational device" knows what the user is doing. NVIDIA, too, has sent the cursor to battle. It "fights [not] for the users." https://www.imdb.com/title/tt0084827/quotes/?item=qt0406294.

Were you to investigate academic treatments of privacy concerns vis-à-vis cloud gaming, or even computing generally, password security and data breaches would occupy your field of view. Netflix musters its "password strength" powers to ensure your data is safely in its hands, for its use. Cloud gamers may expect the same passionate concern from their providers, too. In many respects, Netflix may stand in for cloud gaming companies. "Cloud gaming companies collect and store a large amount of personal data from users, including their gaming preferences, purchase history, and personal information." https://www.lexology.com/library/detail.aspx?g=07715def-72bb-4181-88ba-2a9c6fa0646b. Consider again the Netflix analog. When users had to order DVDs, Netflix's development of user preferences required at least some physical interaction; now, they even know which movies you mull over but ultimately pass on. This is a disaster.

A few television streaming services held out against advertising until quite recently. Forgoing this revenue proved too much for even the most dutiful companies, and cloud gaming services are unsurprisingly following suit. See https://www.thurrott.com/games/298440/nvidia-geforce-now-free-tier-is-getting-pre-roll-ads. State legislatures were passing laws regulating the content of video games because of their alleged harmful effects on users as early as 2008; governments and private citizens should be just as alarmed by this sort of "personalized," which is to say predatory and targeted, advertising. See https://www.cga.ct.gov/2008/rpt/2008-R-0233.htm#:~:text=Several%20states%2C%20including%20California%2C%20Georgia,sale%20of%20such%20video%20games. If there is even an ounce of truth in the "social media is killing free will" mantra, then we should all be alarmed. See https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9597644/.

One response would be to attempt a clampdown on these breaches, or to return to the early years, when cloud gaming was small and bifurcated into a hobbyist faction and a nascent commercial industry (think NVIDIA "GeForce Now"); both were cottage industries. This is assuredly an either-or fallacy. Recall the researchers at the University of Michigan who built a do-it-yourself, free cloud gaming implementation that was "applauded internationally by [several thousand] cloud gaming enthusiasts from Latin America to Singapore." https://project-paladin.org/. It appears, then, that the cloud gaming industry might be primed for a grassroots deployment of a more private cloud gaming model. The success of such an effort relies on widespread adoption, typically by word of mouth or other nonmainstream channels. By leveraging the amateur cloud gaming subculture, the cursor's power to observe and control could be minimized. Interested researchers would be wise to devote effort to this area.


AnthonyFikryFirstPaper 15 - 01 Mar 2024 - Main.HafsahHanif
Line: 1 to 1
 
META TOPICPARENT name="FirstPaper"
Changed:
<
<
High in the sky: cloud gaming and privacy
>
>

High in the sky: cloud gaming and privacy

-- By RobertLarese - 01 Mar 2024

Computer gaming - to be deliberately cavalier, this includes gaming on a PC or a console, or even your smart phone - is something of a binary. All forms may be played on a third-party server or on a local machine. Consider first the second means. To enjoy the thing to be enjoyed, another set of third parties typically provides access to group playing, so-called "going online," being "online" or, simply, "on," by connecting local machines together over the internet. Microsoft is one of these other third parties. Bill Gates is happy to sell you the local machine and, for an allegedly paltry ten dollars a month, provide the connectivity to other local machines. See https://www.xbox.com/en-US/live/gold. But this type of gaming consists of at least personal hardware running the game locally, should the "gamer" forgo multiplayer. The twenty-first century heralded a competitor to this local-machine-plus-internet model.

AnthonyFikryFirstPaper 14 - 01 Mar 2024 - Main.RobertLarese
Line: 1 to 1
 
META TOPICPARENT name="FirstPaper"
Changed:
<
<
test
>
>
High in the sky: cloud gaming and privacy



AnthonyFikryFirstPaper 13 - 01 Mar 2024 - Main.AndreasLeptos
Line: 1 to 1
 
META TOPICPARENT name="FirstPaper"
test

AnthonyFikryFirstPaper 12 - 01 Mar 2024 - Main.RobertLarese
Line: 1 to 1
 
META TOPICPARENT name="FirstPaper"
Changed:
<
<
>
>
test

AnthonyFikryFirstPaper 11 - 21 Feb 2024 - Main.RobertLarese
Line: 1 to 1
 
META TOPICPARENT name="FirstPaper"

AnthonyFikryFirstPaper 10 - 27 Mar 2022 - Main.MirelisValle
Line: 1 to 1
Changed:
<
<
META TOPICPARENT name="ArchivedMaterial"
Section 220's Privacy Problem -- Ro Reynolds
>
>
META TOPICPARENT name="FirstPaper"
 
Deleted:
<
<
Modern technology has created a perhaps untenable environment in boardrooms. It is virtually impossible to truly delete anything that has been typed, emailed, texted, and so on. Delaware corporate law, and particularly the development of Section 220 claims, has created an enormous privacy issue for directors, as courts can now order electronic records to be produced during litigation. Without an expectation of privacy, or even the possibility of one, directors will not be free to voice their opinions openly, hamstringing their roles in the boardroom.

In 1985, in Smith v. Van Gorkom (https://casetext.com/case/smith-v-van-gorkom), a major case in Delaware corporate law, the Delaware Supreme Court found that directors could be held personally liable if they act on an uninformed basis. Given the magnitude of the transaction, the directors had an obligation to ask questions and to make themselves informed. By failing to do so, they violated their duty of care and could not hide behind the business judgment rule. This holding was rather controversial because it created incentives to avoid serving on boards. Indeed, before Van Gorkom there were virtually no cases of director liability for breach of the duty of care because, from a shareholder's perspective, one would not want directors to be liable for the full cost of a misstep. If they were, directors would never do anything risky, and thus nothing profitable. Shareholders could always diversify their risk, perhaps by holding a diverse portfolio. What they did not want was their directors acting like their surgeons. So, in many ways, Van Gorkom served as a wake-up call. Boards could not be treated like social clubs, and directors could no longer act as if their duty were just to support the CEO. Serving on a board of directors would become more of a commitment. Despite the passage of DGCL §102(b)(7), which validated charter provisions providing that a director has no liability for losses caused by transactions in which the director had no conflicting financial interest and did not otherwise violate the duty of loyalty, board processes improved significantly. Over the next two decades, board composition improved as well, with directors included for more reason than a friendship with the CEO.

The improved processes and transparency that resulted from the case have, however, created a privacy problem. Plaintiffs' lawyers can now file Section 220 claims to obtain corporate records and documents for a proper purpose, typically a breach of fiduciary duty claim. These Section 220 claims are not limited to books and records: a recent Delaware Supreme Court decision held that electronic records and communications could be fair game. While records produced in Section 220 actions are generally kept non-public, ultimately only confidential information such as trade secrets will be kept under seal. Corporations now face the problem that anything typed cannot truly be destroyed. Historically, notes taken during a board meeting could always be shredded or otherwise destroyed. Today, because of modern technology, even if a director deletes something they have typed, or versions up a document, the original can likely still be found and recovered. Corporations and their directors must now contend with the increased liability that results from the fact that any electronic record, from notes to text messages, could be revealed.

The privacy issue is significant enough that major law firms have published guidance for directors on their websites. Yet even these firms have not produced any elegant solution for protecting directors' privacy. Instead, the guidance boils down to advising against sending emails and text messages discussing material matters, and to scheduling calls and meetings for substantive matters. Perhaps most poignant, one article on Skadden's website suggests the following: a good rule of thumb, before texting or emailing, is to ask, "Would you want to read this in a newspaper?" (https://www.skadden.com/insights/publications/2021/10/the-informed-board/this-isnt-your-grandparents-books-and-records#:~:text=Section%20220%20of%20the%20Delaware,by%20the%20board%20or%20management.) Directors today would do well to remember Melvin Gross v. Biogen Inc., which limited the plaintiff to inspecting board-level materials on the grounds that "[t]hese documents and communications will enable Plaintiff to assess the extent to which Board members were made aware of the alleged wrongdoing and to evaluate how the Board members responded to the Investigation." In essence, Section 220 claims are subject to some restraints and will not guarantee access to corporate records, emails, and texts if formal board-level materials exist, are available, and would satisfy a plaintiff's proper purpose and demand. So, if directors can refrain from conducting business over email, text, and other informal channels, their electronic communications will not be subject to inspection.

"Just don't do it" is, of course, easier said than done in today's hyper-connected world. Business is conducted through texts and emails, imprudent as that may be. And while directors may sacrifice practicality for the sake of privacy and avoid emails discussing sensitive matters, they will be hard pressed to avoid keyboards altogether. Even if someone were to delete an improper comment, sentence, or entire document typed on a laptop, those words will still be recoverable and thus theoretically subject to a Section 220 claim. As David Katz has noted, in order for a board to function properly and fulfill its role, directors must be able to express their thoughts and opinions freely without fear that they will be made public (https://corpgov.law.harvard.edu/2014/01/23/boardroom-confidentiality-under-focus/). If they cannot (and indeed they cannot, for Section 220 provides the legal hook with which to reach materials that computers prevent from ever being completely private), then either boards will cease to function effectively, people will be less likely to serve as directors, or those who do serve will face increased liability.

AnthonyFikryFirstPaper 9 - 23 Mar 2022 - Main.MylesAmbrose
Line: 1 to 1
 
META TOPICPARENT name="ArchivedMaterial"
Section 220's Privacy Problem -- Ro Reynolds

AnthonyFikryFirstPaper 8 - 15 Mar 2022 - Main.JoseOtero
Line: 1 to 1
 
META TOPICPARENT name="ArchivedMaterial"
Section 220's Privacy Problem -- Ro Reynolds

AnthonyFikryFirstPaper 7 - 12 Mar 2022 - Main.HiroyukiTanaka
Line: 1 to 1
 
META TOPICPARENT name="ArchivedMaterial"
Section 220's Privacy Problem -- Ro Reynolds

AnthonyFikryFirstPaper 6 - 08 Mar 2022 - Main.RoReynolds
Line: 1 to 1
Changed:
<
<
META TOPICPARENT name="FirstPaper"
>
>
META TOPICPARENT name="ArchivedMaterial"
Section 220's Privacy Problem Ro Reynolds
 
Added:
>
>
Modern technology has created a perhaps untenable environment in boardrooms. It is virtually impossible to truly delete anything that has been typed, emailed, texted, etc. Delaware corporate law, and particularly the development of Section 220 claims, has created an enormous privacy issue for directors, as now courts can order electronic records to be produced during litigation. Without an expectation of privacy, or even a possibility of such, directors will not be free to openly voice their opinions, thus hamstringing their roles in the boardroom. In 1985, in Smith v. Van Gorkom [https://casetext.com/case/smith-v-van-gorkom]], a major case in Delaware corporate law, the Delaware Supreme Court found that directors could be held personally liable if they act on an uniformed basis. Given the magnitude of the transaction, directors had the obligation to ask questions and to make themselves informed. By not doing so, they violated their duty of care by being uninformed, and could not hide behind the business judgment rule. This holding was rather controversial because it created incentives to avoid serving on boards. Indeed, before Van Gorkom, there were virtually no cases of director liability for breach of duty of care because, from a shareholder's perspective, one would not want their directors to be liable for the full cost of a misstep. If they were, directors would never do anything risky, and thus nothing profitable. Shareholders could always diversify their risk, perhaps through holding a diverse portfolio. What they did not want was their directors acting like their surgeons. So, in many ways Van Gorkom served as a wakeup call. Boards could not be treated like social clubs, and directors could no longer act as if their duty was just to support the CEO. Serving on a board of directors would become more of a commitment. Despite the passage of DGCL §102(b)(7), which validated charter amendments that provide a corporate director has no liability for losses caused by transactions in which the director has no conflicting financial interest or otherwise violated the duty of loyalty, board processes improved significantly. Over the next two decades, board composition improved as well by including directors for more reason than a friendship with the CEO. The improved processes and transparency on boards as a result of the case have created a privacy problem however. Plaintiffs' lawyers now can file Section 220 claims to obtain corporate records and documents for a proper purpose, typically a breach of fiduciary duty claim. These Section 220 claims are not limited to books and records, as a recent Delaware Supreme Court decision held that electronic records and communications could be fair game. While records produced as part of Section 220 claims are generally kept non-public, ultimately only confidential information such as trade secrets will be kept under seal. Corporations now face the issue that anything typed cannot be destroyed. Historically, when notes were taken during a board meeting, they could always be shredded or otherwise destroyed. Yet today, due to modern technology, even if a director deletes something they have typed, or versions up a document, that original can likely still be found and recovered. Corporations and their directors must now contend with the increased liability for directors that results from the fact that any electronic record from notes to text messages could be revealed. 
The privacy issue is significant enough that major law firms have published guidance on their websites for directors. Yet, even these firms have not been able to produce any sort of elegant solution to protecting directors' privacy. Instead, the guidance boils down to advising against sending emails and text messages discussing material matters, and scheduling calls and meetings for substantive matters. Perhaps most poignant, one article on Skadden's website suggests the following: A good rule of thumb, before texting or emailing, is to ask, "Would you want to read this in a newspaper?" [https://www.skadden.com/insights/publications/2021/10/the-informed-board/this-isnt-your-grandparents-books-and-records#:~:text=Section%20220%20of%20the%20Delaware,by%20the%20board%20or%20management.]] Directors today would do well to remember Melvin Gross v. Biogen Inc., which limited the plaintiff to inspecting board-level materials on the grounds that, "[t]hese documents and communications will enable Plaintiff to assess the extent to which Board members were made aware of the alleged wrongdoing and to evaluate how the Board members responded to the Investigation." In essence, Section 220 claims are subject to some restraints and will not guarantee access to corporate records, emails, and texts if the formal board-level materials exists, are available, and would satisfy a plaintiff's proper purpose and demand. So, if directors can refrain from conducting business over email, text, and other informal channels, their electronic communications will not be subject to inspection. "Just don't do it" is of course easier said than done in today's hyper-connected world. Business is conducted through texts and emails, imprudent as it may be. And, while directors may sacrifice practicality for the sake of privacy and avoid emails discussing sensitive matters, they will be hard pressed to avoid keyboards altogether. Even if someone were to delete an improper comment, sentence, or entire document typed on a laptop for example, those words will still be recoverable and thus theoretically subject to a Section 220 claim. As noted by David Katz, in order for a board to function properly and fulfill its role, directors must be able to express their thoughts and opinions freely without fear that they will be made [https://corpgov.law.harvard.edu/2014/01/23/boardroom-confidentiality-under-focus/]]. If they cannot (and indeed they cannot for Section 220 provides the legal hook with which to access materials computers prevent from being completely private), then either boards will cease to function effectively, people will be less likely to serve as directors, or those who do will be subject to increased liability.

Recent Remarkable Decision by the Supreme Court of Japan Concerning the "Right to Be Forgotten"
 
It is obvious today that search engines play a crucial role in the distribution of information on the Internet. At the same time, however, the semi-permanence of, and easy access to, online expression has increasingly exposed people's privacy. Against this background, the European Court of Justice held in Google Spain v. AEPD(*1), the landmark case in this area, that a citizen had a right to privacy in his past economic troubles, in effect accepting the "right to be forgotten"; the General Data Protection Regulation, which explicitly provides for a "right to erasure," followed. The United States, by contrast, has historically placed greater emphasis on freedom of speech than on privacy(*2). In Japan, lower courts had issued many decisions since 2014 on claims against search engine operators to remove particular search results containing plaintiffs' private information, but those decisions were divided until the Supreme Court's first decision on the issue (the "Decision")(*3) in January 2017. In the case before the Supreme Court, the plaintiff sought removal by Google of the search results (titles and snippets) for websites that disclosed his arrest record for child prostitution and pornography three years before the lawsuit was filed. The court ultimately rejected the claim, and the Decision was therefore widely reported as "the first decision by the Supreme Court refusing the right to be forgotten." Below are some of the key points of this landmark decision.

First, the court addressed whether (and in what sense) search engine operators make their own expressions. Before the Decision, several lower courts had held that search engine operators make no expression of their own, primarily because the entire process of collecting, organizing, and supplying online information is carried out automatically by computer programs; on that view, the operators could not be held responsible for removing search results. The Supreme Court held to the contrary that the service provided by search engine operators has an aspect of "expression by the search engine operators themselves," reasoning that the programs they use are designed so that search results are generated in accordance with the operator's own policy. The court went on to state that forcing a search engine operator to remove specific search results should therefore be regarded as a constraint on expression made in accordance with that policy. In this way, the Supreme Court not only rejected the view of several lower courts but also articulated an understanding of the "expression" made by search engine operators as distinct from that of traditional mass media, which gather and edit the news.

Another notable point in the reasoning is the great weight the court gave to the function of search engines. Although the court did not refer expressly to citizens' "right to be informed," it stated that the service provided by search engine operators plays an important role as infrastructure for the flow of information online by helping the public find what it needs within the massive amount of information on the Internet. This consideration presumably had a substantial impact on the ruling, as discussed in the next paragraph.

Next, the Supreme Court ruled within the general legal framework governing privacy rights. For civil claims arising from invasions of privacy by the print press (such as claims for damages or injunctions against publications disclosing private facts), Japanese courts have established the rule that a claim is upheld only if the privacy concerns outweigh the public interest in disclosure. The Decision, while essentially based on this traditional framework, set a higher hurdle, requiring that the privacy concerns "clearly" outweigh the public interest. The court applied this standard, which is more favorable to search engine operators than the ordinary test, because it placed great emphasis on the important function of search engines as well as on the importance of freedom of expression, as discussed in the preceding paragraphs.

Finally, unlike Google Spain, the Supreme Court of Japan made no reference to the "right to be forgotten." Instead, it characterized the plaintiff's interest as the interest in not having facts within the scope of one's private life disclosed. The Decision nevertheless resembles Google Spain in treating the right to privacy (or the "right to be forgotten") as not absolute: it must always be balanced against other fundamental rights and interests, which requires careful case-by-case analysis. In other words, whether a court upholds such a claim depends heavily on the facts of each case, both in the EU and in Japan.

As discussed above, the Decision not only established the legal framework to be applied in this type of dispute but also set out the Supreme Court's understanding of search engines in light of freedom of speech and related issues. How that standard will be applied in particular cases, however, remains open to a considerable extent and will have to be worked out by the courts in the years ahead.

(*1) ECJ, Case C-131/12, Google Spain v. AEPD (2014).
(*2) Andrew Neville, Is It a Human Right To Be Forgotten? Conceptualizing the World View, 15 Santa Clara J. Int'l L. 157, 167 (2017).
(*3) Supreme Court of Japan, Judgment, January 31, 2017, 1669 Sai-Ji 1.
