Computers, Privacy & the Constitution


Searching for Truth: The Illusion of Search Engines

-- By HafsahHanif - 01 Mar 2024

Introduction

With approximately seventy percent of the global population using the internet, search engines hold immense power in today’s society. Search engines, however, are merely the product of algorithms and computations that may be manipulated by their programmers and users. And while search engines are highly susceptible to manipulation and the dissemination of misinformation, their creators often evade responsibility for the harm they cause because the First Amendment shields them.

Search Engines Are Powerful but Dangerous Tools

Search engines are no longer––if they ever were––analogous to a library that stores fact-checked or credible sources. They have become a tool for the propagation of propaganda and misinformation, catering to the highest bidder and curating results to maximize user engagement. Users may have a duty to verify the results a search engine provides, but that duty does not mean search engines bear no responsibility for the damage and violence their results may cause. Where the burden falls on the user and never on the service provider, we create a dangerous precedent in which accountability is conveniently shifted away from those who have the power to control the dissemination of information. By absolving search engines of responsibility for the consequences of their algorithms, we effectively allow them to operate with impunity, prioritizing profit over truth and social well-being. This not only undermines legal principles of accountability but also exacerbates the proliferation of misinformation and divisive content.

The case of convicted murderer Dylann Roof underscores the profound impact of propaganda promoted by online platforms in shaping extremist ideologies and inciting violence. Trying to make sense of the Trayvon Martin case, Roof searched “black on White crime” on Google. The first result Roof saw was from the white nationalist Council of Conservative Citizens; the views he encountered there ultimately motivated his plan to massacre Black civilians in an attempt to start a race war.

Roof’s case raises the question of why and how such extremist propaganda and unreliable information was not only disseminated but also highly ranked by Google’s search engine. Many believe that the ranking system Google employs is an indicator of reliability and credibility. Indeed, research has shown that people often look only at the first page of results a search engine produces. The reality, however, is that the top-ranking sites can be manipulated or curated to reflect the biases of the user. These websites may belong to those who pay premium prices to have their sites appear higher on the results page, or to those who have gamed the algorithm through search engine optimization tactics. Moreover, the algorithms underlying search engines are built to rely on collected data and web history. Using this information, search engines learn the user’s preferences and then direct the user toward content that confirms or reaffirms existing beliefs and biases.
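To make the mechanism concrete, consider a minimal sketch of a personalized ranker. It is purely illustrative: the weights, fields, and pages below are hypothetical and are not drawn from Google's actual system. The structural point is that once paid boosts and inferred interests enter the score, credibility never does.

# Illustrative sketch only: a toy ranking function in which paid
# placement and personalization reorder results. All names, fields,
# and weights are hypothetical, not Google's actual algorithm.

from dataclasses import dataclass, field

@dataclass
class Page:
    url: str
    relevance: float            # base query-match score, 0..1
    paid_boost: float = 0.0     # placement bought or won through SEO
    topics: set = field(default_factory=set)

def score(page: Page, user_interests: set) -> float:
    # Personalization: pages matching the user's inferred interests get
    # a bonus, so two users see different orderings for the same query.
    affinity = 0.3 * len(page.topics & user_interests)
    return page.relevance + page.paid_boost + affinity

def rank(pages: list, user_interests: set) -> list:
    return sorted(pages, key=lambda p: score(p, user_interests), reverse=True)

results = rank(
    [Page("https://example.org/research", relevance=0.9, topics={"statistics"}),
     Page("https://example.org/propaganda", relevance=0.4,
          paid_boost=0.3, topics={"extremism"})],
    user_interests={"extremism"},
)
# The weaker, boosted page now outranks the credible one:
print([p.url for p in results])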

First Amendment Protection

Despite the dangers posed by search engines, these systems continue to enjoy protection under the First Amendment—a notion that has gained judicial support in recent years. This is so even though Google’s search algorithm has historically “fail[ed] to find the middle ground between protecting free speech and protecting its users from misinformation.” The sentiment underlying this protection is that search results reflect the “individual editorial choices” of the company—a form of speech that the First Amendment protects. By “select[ing] what information it presents and how it presents it,” Google exercises its right to free speech in a manner akin to that of a newspaper or forum editor, free from government interference.

But this argument is inherently flawed. The information presented by Google’s search engine does not necessarily reflect the company’s “editorial choices”; rather, an algorithm developed by Google but designed to learn through user engagement is often the one making “editorial choices” when determining which results to produce for a given search. As deeply flawed as human beings are, algorithms are not only far more susceptible to error but also lack judgment and the ability to discern right from wrong. While programmers may install safeguards that train algorithms to detect unreliable information, these protections are easily circumvented. By extending First Amendment protections to an artificially intelligent algorithm, courts have diminished the core principles of free speech and eroded human agency and accountability in its exercise.
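How easily such safeguards give way can be seen in a deliberately naive sketch. The blocklist, terms, and obfuscation below are hypothetical; real systems are more sophisticated, but the cat-and-mouse dynamic is the same: a filter keyed to known patterns misses the first trivial variation it has not seen.

# Illustrative sketch only: a toy keyword "safeguard" of the kind a
# programmer might install, and a trivially obfuscated input that slips
# past it. The blocklist and inputs are hypothetical.

BLOCKLIST = {"racewar", "whitegenocide"}

def is_flagged(text: str) -> bool:
    # Naive check: strip spaces, lowercase, look for known bad terms.
    normalized = text.lower().replace(" ", "")
    return any(term in normalized for term in BLOCKLIST)

print(is_flagged("race war now"))    # True: the filter catches it
print(is_flagged("r4ce w4r now"))    # False: simple leetspeak evades it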

Privacy Concerns

Search engines also raise concerns about individual privacy. Companies such as Google, which have built their brand around the prowess of their search engine, meet the demands of their users by curating results tailored to individual preferences and search histories. While users believe search engines are providing them with useful information, the reality is that the algorithm is “steer[ing] them toward content that confirms their likes, dislikes and biases.”
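The feedback loop behind that steering can be sketched in a few lines. This is an assumption-laden toy, not any company's actual code: the profile structure, topics, and 0.1 weight are invented. What it shows is the ratchet: each click strengthens the very signal that produced the click.

# Illustrative sketch only: a toy feedback loop in which every click
# raises the weight of that page's topics in the user's profile, so the
# next ranking leans further toward what was already clicked.

from collections import Counter

profile = Counter()  # inferred interests, built entirely from clicks

def record_click(page_topics: set) -> None:
    # Each click tells the model "this is what the user believes in."
    profile.update(page_topics)

def personalization_bonus(page_topics: set) -> float:
    # Pages on already-clicked topics get an ever-growing boost.
    return 0.1 * sum(profile[t] for t in page_topics)

record_click({"conspiracy"})
record_click({"conspiracy"})
print(personalization_bonus({"conspiracy"}))   # 0.2, and growing
print(personalization_bonus({"fact-check"}))   # 0.0, never reinforced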

In the most extreme cases, such reinforcement can contribute to the radicalization of individuals, as in the case of Dylann Roof, who became immersed in a spiral of extremist ideologies. More often, however, the concern with collected web histories and data is that search engines sell this information to advertisers and other third-party companies to generate profit. Google uses personal data in two ways: “It uses data to build individual profiles with demographics and interests, then lets advertisers target groups of people based on those traits” and “[i]t shares data with advertisers directly and asks them to bid on individual ads.” In both instances, user privacy and protection are diminished as users are directed toward specific websites, ultimately undermining individual autonomy to serve the interests of a data-driven system.
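Those two data flows can be separated in a short sketch. Everything here is hypothetical: the profile fields, advertiser names, and bid amounts are invented, and real ad exchanges are far more elaborate. But the division of labor mirrors the quoted description: trait-based audience targeting on one side, per-impression bidding on the other.

# Illustrative sketch only: the two data flows the quoted description
# names. (1) An interest profile supports trait-based audience
# targeting; (2) advertisers bid on one specific impression. All
# fields, names, and amounts are hypothetical.

user_profile = {
    "demographics": {"age_range": "18-24"},
    "interests": ["forums", "firearms"],
}

# (1) Audience targeting: an advertiser buys access to a trait group.
def in_audience(profile: dict, required_interest: str) -> bool:
    return required_interest in profile["interests"]

# (2) Per-impression bidding: advertisers see data about this user and
# bid on this single ad slot; the highest bidder wins.
bids = {"AdvertiserA": 0.42, "AdvertiserB": 0.38}
winner = max(bids, key=bids.get)

print(in_audience(user_profile, "firearms"), winner)  # True AdvertiserA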

Conclusion

As gatekeepers of digital knowledge, search engines wield significant power in shaping societies and influencing decisions. They are often perceived as endless libraries containing a wealth of reliable information. In reality, they are highly manipulated systems that influence human behavior, often without detection. By allowing search engines and internet companies to evade responsibility, we risk undermining the integrity of online information ecosystems and compromising the privacy and rights of users.




