Law in the Internet Society

Financial Privacy, Digital Redlining, and Restoring the Commons


-- By RaulCarrillo - 21 Jan 2015

Money and Credit in the Digital Era

We have spent significant time in this course discussing invasions of privacy, including within the sphere of consumer finance. Data-mining assaults our anonymity and autonomy, supposedly compensating us with convenience. Yet the harvesting of financial information outright punishes many of us. Data can be siphoned and subsequently used to exacerbate some of the oldest socioeconomic hierarchies in the country, excluding minorities from the few options for money and credit available on the socioeconomic periphery.

In a collaborative essay for The Morningside Muckraker, I recently wrote about how free software and free culture advocates could benefit from updated understandings of public finance, particularly from the Modern Money (MM) paradigm. This school of thought highlights how the United States and other “monetarily sovereign” countries--those with fiat currencies and floating exchange rates--create money at zero marginal cost. That is, central banks in these countries simply create currency with keystrokes, and no matter how much they create, unlike businesses or households, their governments cannot possibly “run out.” This does not mean central banks can create purchasing power at zero marginal cost--we still have to account for price stability (inflation)--but it does mean monetarily sovereign governments face no solvency constraint per se.

This is an important legal and economic fact to grasp because it renders most arguments for austerity utterly outdated. Yet most governments still needlessly attempt to balance budgets on the backs of the poor and the middle class, so those who need purchasing power are generally forced to seek credit from banks or from more nefarious entities in the private sector.

Algorithms and Legalized Discrimination

This isn’t going well. Worse than usual, in fact. Last spring, in a piece entitled “Redlining for the 21st Century,” Bill Davidow of The Atlantic sketched some of the ways firms employ algorithms to deny people loans outright, or to charge them much higher rates, based solely on their inferred race. For example, many watchdogs have noted that mortgage providers can now determine a customer’s ZIP code from their IP address, and then set rates based on the neighborhood where the customer currently lives. If a human employee did this, that individual would be acting in clear violation of the Fair Housing Act of 1968. Yet, because an algorithm is the actor, the practice often goes unexamined, and people who might otherwise receive credit at a reasonable rate are harmed.
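Davidow’s example can be made concrete with a short sketch. Everything below is hypothetical--the lender, the ZIP-to-premium table, and the numbers are all invented for illustration--but it shows how a pricing rule keyed only on a ZIP code geolocated from an IP address can reproduce redlining without ever recording an applicant’s race:

```python
# Illustrative sketch of proxy discrimination in loan pricing.
# All names and figures are hypothetical.

# A "neighborhood risk premium" table learned from historical lending
# data. Because that history reflects decades of segregation, the
# premium correlates strongly with a neighborhood's racial makeup,
# even though race never appears as an input.
ZIP_RISK_PREMIUM = {
    "10027": 0.035,  # hypothetical historically redlined neighborhood
    "10583": 0.005,  # hypothetical affluent suburb
}

BASE_RATE = 0.04  # hypothetical baseline mortgage rate
DEFAULT_PREMIUM = 0.02  # applied when a ZIP code is unknown

def quoted_rate(zip_code: str) -> float:
    """Return the interest rate quoted to an applicant, keyed only on
    the ZIP code geolocated from their IP address."""
    return BASE_RATE + ZIP_RISK_PREMIUM.get(zip_code, DEFAULT_PREMIUM)

# Two otherwise identical applicants receive different quotes based
# solely on where they currently live -- the same outcome the Fair
# Housing Act forbids a human loan officer to produce.
print(quoted_rate("10027"))  # ~0.075
print(quoted_rate("10583"))  # ~0.045
```

The point of the sketch is that no line of this code mentions race, which is precisely why the practice evades scrutiny: the discrimination lives in the training data behind the premium table, not in any examinable rule.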

In class, we’ve touched on the legal history of presuming machines to be innocent. In a recent law review article, University of Maryland law professors Danielle Keats Citron and Frank Pasquale detailed how credit-scoring systems have always been “inevitably subjective and value-laden” despite their ostensible objectivity and simplicity. Although the systems in question were initially built to eliminate discriminatory practices, they can only be as free from bias as the software behind them, and thus only as righteous as the values of their developers and programmers. But the law generally ignores this. In fact, credit bureaus are not required to reveal how they convert data into scores; those processes are deemed “trade secrets.”

In The Black Box Society, Pasquale notes that although the majority of people negatively impacted by the structure of a particular algorithm may be Black, Latino, women, LGBTQ, or people with disabilities, the algorithms themselves are almost entirely immune from scrutiny. To make matters worse, Title VII of the Civil Rights Act of 1964 has been deemed “largely ill equipped” to address the discrimination that results from data-mining. There is, in effect, no applicable law to enforce.

Worse still, digital discrimination is not limited to private finance. In an essay entitled [[http://newamerica.org/downloads/OTI-Data-an-Discrimination-FINAL-small.pdf][“Big Data and Human Rights”]], Virginia Eubanks, an Associate Professor of Women’s, Gender, and Sexuality Studies at SUNY Albany, notes that although New York State social service providers initially used algorithms to expose the discrimination of the state’s own employees, the tables have turned. Faced with the fiscal burden of supplying benefits in the wake of the recession, the state commissioned technologies that replaced the decisions of social workers, allowing bureaucracies to deny benefits under the guise of objective machines. Now, computers make choices about spending based on “time worn, race and class motivated assumptions about welfare recipients.”

Assaulting the Commons

It is obvious that the poor have always struggled to obtain money and credit. Government spending and private lending are inherently discriminatory. Yet technology is producing new forms of discrimination--as well as reviving old ones. The racial wealth gap of the 20th century, in particular, is largely explained by redlining, along with housing and lending discrimination more generally. The Civil Rights Movement and subsequent legislation were supposed to have eradicated these practices, at most leaving people to be discriminated against merely because they were poor, not because of other demographic factors. Yet the practices persist.

Thus, if economic and racial justice advocates do not understand Big Data, we will always be one step behind. For example, would-be reformers need to know that people of color may be penalized for attempting to access credit solely by virtue of the digital method through which they apply. We may be barred from economic advancement by operating within an infrastructure that is arguably ours by birthright.

Digital discrimination fits into Moglen’s narrative of an assault upon “the commons,” two layers deep. In the age of austerity, the government refuses to subsidize economic existence via public coffers despite clearly having the means to do so, and thus many people must rely on private credit to survive. Yet surveillance is making this option substantially more difficult for certain groups of people. The government and its licensed agents--banks and other lending institutions--harvest data within what should be a free knowledge commons and use it to either outright deny credit to historically marginalized groups, or charge them exorbitantly, thus depriving them of the small shot they might have had at security or mobility. Although financial entities certainly use social surveillance to prey on many people nearer the apex of society, surveillance can be a death knell for minorities on the periphery.

