Law in the Internet Society

A Tale from the Trenches

-- By GraceHopper - 04 Oct 2023

A Revolutionary Act

In 2021 I viewed my last social media post. It was a picture of a high school friend's baby. For a brief second, I felt a prickle of oxytocin; I shared a moment of closeness with this stranger I had not spoken to in over a decade. And then the wash of anxiety came. The baby wore matching pink clothes in a spotless room, in stark contrast to the collection of empty Monster cans on my desk. Should I post something? Maybe with a picture of a Monster can and code running in the background, I could cosplay as a hacker in a spy thriller, captioned "living the dream"? Would people respond? And then I remembered why I sat there alone in a dark room. I logged off social media that night and did not sign back in for a long time. In late 2022 I logged into my accounts one last time to request that they delete all my data. Whether or not the companies complied is immaterial; for the first time since childhood I had broken free of the Silicon Trap that held my mind captive. From firewalls to VPNs to strictly segmenting my online identity across separate VMs and phones, I took steps to protect my mind from the parasite.

I like to consider myself as much a philosopher as a computer nerd, but I am exceptional at neither. I leave the philosophers to wax poetic about the meaning of freedom and the computer nerds, with their Monsters, to code. Instead, I will explain a piece of the parasite, the threat it poses, and muse on paths forward.

A Piece of the Parasite

The recommendation engines used by tech companies are no secret. The companies do not try to hide them: they post papers on their research sites boasting that this is what we do, this is how we get you.

Recommendation engines have three main components: the content, your data, and the model.

The Content.

The first component is a matrix of content on the social media platform: a mix of advertisements, posts created by real humans, and Russian bots. Classification models categorize this content along dimensions such as tone (yes, you can quantify love), topic (a political post versus puppies), and purpose (a call to action versus a recommendation).
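To make the content side concrete, here is a minimal sketch of tagging posts along tone, topic, and purpose. Real platforms use trained classifiers; the keyword rules, labels, and example posts below are my own hypothetical stand-ins.

```python
# Toy content classifier: each post gets a row of labels along the
# dimensions the essay names. The keyword rules are invented examples,
# not any platform's actual model.

def classify(post: str) -> dict:
    text = post.lower()
    return {
        "tone": "positive" if any(w in text for w in ("love", "cute", "great")) else "neutral",
        "topic": "politics" if "vote" in text else ("pets" if "puppy" in text else "other"),
        "purpose": "call_to_action" if any(w in text for w in ("buy", "vote", "share")) else "statement",
    }

posts = [
    "Look at this cute puppy!",
    "Vote tomorrow - share with friends",
]
content_matrix = [classify(p) for p in posts]  # one labeled row per post
for row in content_matrix:
    print(row)
```

A production system would replace these keyword rules with trained models, but the output shape is the same: a matrix with one labeled row per piece of content.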

Your Data.

The second is a matrix of user profiles. Data scientists are hungry for any information on you, however minute. Your browsing history, the locations you have visited, your search histories in other applications, even when you take your daily bowel movement: we want it all. And generally, we have it all. Surveillance capitalism has taken the wheel and runs rampant. The site whose terms and conditions you scrolled past to read about Hailey Bieber's new nail color took your data and packaged it into 24-hour, 48-hour, one-week, and three-week intervals for the convenience of data teams worldwide.
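The packaging the essay describes can be sketched as rolling a raw event log up into those time windows. The event names and timestamps here are invented for illustration.

```python
# Hedged sketch: turning a user's raw event log into the 24-hour,
# 48-hour, 1-week, and 3-week rollups described above. Timestamps are
# "hours ago" for simplicity; the log itself is made up.

from collections import Counter

WINDOWS_HOURS = {"24h": 24, "48h": 48, "1w": 7 * 24, "3w": 21 * 24}

# (hours_ago, event_type) - a hypothetical browsing log for one user
events = [(2, "click"), (30, "click"), (100, "search"), (400, "click")]

profile = {
    name: Counter(kind for hours_ago, kind in events if hours_ago <= limit)
    for name, limit in WINDOWS_HOURS.items()
}
print(profile["24h"])  # only the most recent activity survives this window
```

Stack one such profile per user and you have the second matrix: rows of behavioral counts, bucketed by recency, ready to be sold or fed to a model.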

The Model.

The third component is the model. The first two parts, the content and your data, are fed into the model to train it and to produce recommendations. The model is optimized on clicks (a measure of immediate engagement) and on long-term engagement ("stickiness" is the industry term, a euphemism for addictiveness). The more engagement an algorithm generates, the more money it makes and the more successful it is considered.
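A minimal sketch of this third component: a model trained to predict clicks from combined user-and-content features, then used to rank candidates. This is a pure-Python logistic regression on invented data, not any company's actual engine.

```python
# Click-optimized recommender sketch. Each training example is a
# feature vector (user features + content features) and whether the
# user clicked. All data here is fabricated for illustration.

import math, random
random.seed(0)

data = [([1, 0, 1], 1), ([1, 0, 0], 1), ([0, 1, 1], 0), ([0, 1, 0], 0)]
w = [0.0, 0.0, 0.0]

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, x))
    return 1 / (1 + math.exp(-z))  # probability of a click

for _ in range(500):  # stochastic gradient descent on log loss
    x, y = random.choice(data)
    p = predict(x)
    w = [wi + 0.1 * (y - p) * xi for wi, xi in zip(w, x)]

# rank candidate content: surface whatever the model is surest you will click
candidates = {"post_a": [1, 0, 1], "post_b": [0, 1, 1]}
ranked = sorted(candidates, key=lambda c: predict(candidates[c]), reverse=True)
print(ranked)
```

The objective function is the whole story: nothing in that loop asks whether the click was good for you, only whether it happened.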

The model is further reinforced through "topic clustering". Topic clustering groups users who behave a certain way into a "cluster", then "discovers" other users who look like the cluster and surfaces the same recommendations to them. These clusters generally revolve around the classifications of the content, so multiple clusters can encompass a single user, giving the parasite a unique identity for each user and derivative clusters besides. Topic clustering industrializes the recommendation engine and crystallizes the silicon cage, saving precious memory and processing power while keeping people happily and stupidly clicking within their clusters.
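The clustering step can be sketched as assigning each user to the nearest behavioral centroid, then surfacing one member's clicks to the whole cluster. A real system would use k-means or learned embeddings; this nearest-centroid toy, with its invented users and topics, is mine.

```python
# Topic-clustering sketch: users are vectors of engagement counts over
# topics (here, politics and pets - both invented). Each user joins the
# nearest centroid's cluster, and recommendations spread cluster-wide.

from collections import defaultdict

users = {"ann": (9, 1), "bob": (8, 2), "cat": (1, 9), "dan": (0, 10)}
centroids = {"political": (10, 0), "pet_lovers": (0, 10)}

def nearest(vec):
    # squared Euclidean distance to each centroid; pick the closest
    return min(centroids, key=lambda c: sum((a - b) ** 2 for a, b in zip(vec, centroids[c])))

clusters = defaultdict(list)
for name, vec in users.items():
    clusters[nearest(vec)].append(name)

# whatever one member clicks gets recommended to the whole cluster
clicked = {"ann": "new political post"}
for cluster, members in clusters.items():
    for m in members:
        if clicked.get(m):
            print(f"recommend {clicked[m]!r} to everyone in {cluster}: {members}")
```

Note the economy of it: one model score per cluster instead of one per user, which is exactly the memory and processing saving the paragraph above describes.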

The Threat

When people say they have nothing to hide, that the parasite can have their data, I first laugh, then cringe. People do not understand that every piece of personal data fed to this engine chips away at their freedom of choice. The more information the model has, the better its recommendations and the narrower its results. The recommendation engine transforms our online choices from a cornucopia of ideas into slightly different flavors of the same thing. Social media news feeds whittle the mind until it succumbs to the tyranny of the masses. Freedom of choice is reduced to a narrow band of prepackaged recommendations that the engine is statistically sure we will click on and get addicted to. Advertisements sell products, and that money is pumped back into data science teams, which optimize attention so that still more advertisements can be sold. In this world, what does that mean for freedom of thought?

Even more nefarious is topic clustering. The more information the model has on users, the more accurately it clusters them. Attention stealing and addiction happen en masse as companies target entire clusters to sell to. I do not have to ask what happens when bad actors wield this power; Cambridge Analytica and the 2016 election already answered.

A Path Forward

For every byte of data given to the parasite, a piece of freedom succumbs to its voracious appetite. We must cut ties with the parasite and reclaim our minds. Once we take that revolutionary step, then we strategize. Can tort law hold social media companies responsible for the harms they cause? Can contract law refuse to treat the unbridled use of our data as valid consideration? What role does antitrust law have in this Brave New World? Patent law? Copyright? Or do we need to place a cratering charge in the heart of the parasite and build anew? I do not have answers, only musings, an appetite for knowledge, and a stomach to say "fuck you" to power.


