Computers, Privacy & the Constitution

Big Tech Poses a Threat to Privacy and Free Expression

-- By AbdullahiAbdi - 12 Mar 2022 [Updated 28 April 2022]

Introduction

Technology companies have managed to create a sense of control over how the masses communicate over the internet. While technically anyone on earth can stand up a web server and create a communication platform, the reality is that giant private technology companies such as Facebook are becoming governors of the marketplace of ideas. They control how those who use their platforms may speak, and this greatly threatens freedom of speech and expression.

Another point that is often overlooked is these platforms' pervasive surveillance of people's private data and patterns of behavior. This threatens not only freedom of expression but also freedom of thought. Technology should serve people, rather than turning their data against them.

Since I cannot delve into every aspect of this pervasive surveillance and the ways in which big tech restricts free expression, this paper explores how Facebook's enforcement of its content moderation rules curtails free expression.

Facebook’s moderation policy

Freedom of expression is a core human right protected under international law and in domestic legal systems around the world. It encompasses the right to receive, hold, and share opinions, and it is essential to democratic governance. Yet big social media companies are increasingly restricting this right through content moderation policies: sets of rules and practices these companies use to regulate content shared on their platforms.

These companies often try to combat what they view as harmful content but run the risk of silencing protected speech. Interfering with or removing content affects the rights to freedom of expression and privacy, and can easily shade into censorship.

There are three major problems with Facebook's moderation policies and practices. First, overreliance on automated algorithms leads to incorrect removals. Second, there is often no clear appeal process for those negatively affected. Third, undue government manipulation of Facebook leads to censorship.

Before I joined Columbia Law School, I worked on a project that documented how Facebook deleted several accounts of Somali journalists for allegedly violating Facebook's Community Standards. All the journalists whose accounts were disabled said they did not understand why their accounts were shut down. They were given no prior warning and were not afforded an opportunity to appeal the removal of their accounts.

My investigation revealed two main reasons why Facebook deleted these accounts.

Use of automated algorithms

First, some of the journalists whose accounts were deleted were flagged through automated algorithms. Automated removals have long been criticized as more susceptible to error than human review. These mistakes particularly affect Facebook users outside Western countries, since Facebook's algorithms work only in certain languages, and automated tools often fail to adequately account for context or political, cultural, linguistic, and social differences.

One way Facebook has tried to address this problem is by employing content moderators to conduct review, but recent media reports revealed that even where Facebook employs people to do this work, the staff are overworked and unable to make individualized decisions on contentious and complex issues in the time afforded. For example, at one of its facilities in Africa, Facebook required staff to review videos that were minutes or even hours long, along with other content, and to make decisions within one minute.

Government interference

The other reason was government officials reporting critical journalists and other individuals to Facebook.

Somali government officials were deliberately reporting to Facebook the accounts of independent journalists critical of the authorities, claiming these accounts violated Facebook's Community Standards. Facebook appeared to comply with these requests without adequate investigation. For example, the Facebook account of a prominent government critic was incorrectly disabled and was reactivated only after we questioned Facebook about why the account had been disabled in the first place.

In removing these accounts, which often provided the only source of independent information to people in Somalia, where the press is severely restricted by the government, Facebook was essentially facilitating state censorship.

International human rights standards

International human rights standards require private companies to ensure that human rights and due process considerations are integrated at all stages of their operations. The UN Guiding Principles on Business and Human Rights, for example, require all businesses, including technology companies like Facebook, to respect human rights.

Civil society groups have been thinking about ways to address this phenomenon and have proposed possible solutions. The 2018 Santa Clara Principles, for example, are a civil society charter that outlines minimum standards for companies engaged in content moderation. The charter sets out five foundational principles: (1) human rights and due process considerations; (2) application of understandable rules and policies; (3) integration of cultural competence; (4) recognition of the risks of state involvement in content moderation; and (5) integrity and explainability of the policies. It also sets out three operational principles that should guide moderation practices: greater transparency in content moderation, which requires companies to publicly report the actions they take to restrict or remove content; notice to individuals whose content is moderated; and an appeal process for anyone aggrieved by a moderation decision.

Recommendations to Facebook

Since Facebook is an important forum for mass communication and its policies affect many people across the world, there should be a transparency mandate regarding its policy formation. Such a mandate should require Facebook to share more information with the public, researchers, and governments about how it makes content moderation policies.

Facebook should also not allow itself to be manipulated by government officials into restricting the free expression of voices critical of the authorities. There should be increased due diligence when assessing purported violations of the Community Standards. Lastly, there should be a clear appeals process for individuals, including journalists, human rights activists, and even politicians, whose Facebook accounts are suspended or permanently deleted.

