Computers, Privacy & the Constitution

What's Missing?

-- By MathewKenneally - 14 Mar 2015


An intelligence officer at Columbia put a simple proposition to the audience: the State is offering you security from terrorism, and all it asks you to give up is a smidgen of privacy. What a bargain! Private technology companies say “here is a free service, and we will use your data to customize your experience, no catch!” The “catch” is disguised by the characterization of the privacy debate as a singular transaction. Privacy is environmental. The transaction occurs incrementally and has long-term effects. In short, it is the surrender of power by individuals to corporations and governments in the form of information. This long-term view is missing from the current debate.

Okay, I recognize this idea. Now your task is to expand on it or take it in a new direction. That could be announced here.

People unconcerned by private companies holding data often characterize the use of that data as innocuous. A typical response is “Facebook tried to sell me something I bought three weeks ago” or “I’ll just ignore the ads”. It is true that current online advertising is unsophisticated; however, this presumes it will not improve. Tech companies regularly experiment with ways to predict human behavior. The artificial intelligence crunching the numbers is improving.

The risk is that, with exponential improvement, businesses will be able to take full advantage of the asymmetry of information between consumers and corporations.

Computers can, through browsing data, identify our unconscious preferences. They can already identify users who are pregnant before they know it, and users whose relationships are in trouble before the users do.

The data has two advantages. It can be used to advertise a product to a person at the moment they are most likely to buy it. Moreover, differential pricing can be applied to individuals. This is the entire aim of Amazon: accumulate years of consumer data and use it to calculate what an individual wants and what that individual will pay.

Put this way: suppose you enter a liquor store with no fixed prices, undecided whether you will make a purchase. Do you want the owner to know you have a weakness for whiskey? Do you want the owner to know that you have $150 and, based on your previous behavior, are likely to spend it all? The loss of privacy undermines the power of the individual.

Another example is employment. Today we choose how to market ourselves to employers. Tired of your job and your town? Hop a plane, cross the country, exaggerate your CV, and pitch the new you to employers.

If employers can access information from data aggregation companies, or through simple Google searches, the opportunity for the individual to reinvent herself is lost. Moreover, a combination of education data and credit history allows employers to offer the lowest salary an applicant is likely to accept.

In each example the problem is the same: the corporation knows more about us than we know about it, or, in some cases, than we know about ourselves.

The same can be said about the trade-off in the security context. Recently, the Australian Government introduced a metadata retention scheme, justified by the need to fight terrorism and child pornography. Internet Service Providers (ISPs) must hold the data for two years, and authorities can access it in an investigation of any crime.

This has been characterized as a simple trade-off: a small amount of privacy for security. The law, coupled with increasing technological sophistication, could radically alter the balance of power between law enforcement and citizens.

First, the ability of the authorities to use metadata for any crime may lead us to a “total enforcement state”. The police can enforce any law because physical limitations are removed. Metadata revealing online spending habits higher than reported income could allow low-level drug dealers to be targeted. Police could identify drug addicts by looking at users who browse for health ailments commonly associated with drug use. Facial recognition technology could enable the arrest of anyone with outstanding speeding tickets who attends a ball game. Tracking movements, interactions, and browsing could allow a state to identify an individual’s sexuality.

Enhanced enforcement can undermine the community’s capacity to drive social change through low-level law breaking. For example, a huge number of Americans used cannabis in their 20s. These citizens have seen the cost of cannabis enforcement, but know from personal experience that authorities overstate the risks of cannabis. The result: a dimming of popular support for the war on drugs. Further, homosexuality survived because people could practice it in secret.

Or, it became impossible to deny civil rights, including marriage, to gay people once homosexuality wasn't done in secret anymore, and everyone had to see what "family secrets" otherwise allowed them to hide from themselves.

The consequence would be the loss of citizens’ ability to test the value of laws for ourselves and to experiment with different lifestyles and ways of living that the state has not sanctioned. Instead the Government, if it wishes, can enforce conformity.

Second, the use of metadata could lead to the policing of people based on how they think, not what they do. As stated above, patterns of browsing can reveal subconscious thoughts and desires.

Suppose the authorities identified a pattern in the browsing data of persons before they commit a terrorist attack. The pattern may be that they google ISIS, or that they simply check Facebook more incessantly, or that their browsing becomes more frenetic. Once identified, who could object to the Government checking every user’s browsing to identify who “thinks” like a terrorist? So long as you do not think like a criminal (however it is that criminals think), you have nothing to hide.

This is the other fundamental change, the capacity for the state to surveil thoughts. An oppressive state could identify suspected homosexuals, terrorists, or dissidents before they commit any positive acts.

This is not a basic exchange of data to allow the Government to check who’s calling ISIS.

The trade-off between privacy and security is not simple. The trade-offs security agencies and private companies invite us to make will change the society we live in. Combined with advances in technology, they have the potential to make citizens far less able to exercise agency in the commercial or political sphere.

No wonder neither Facebook nor the NSA puts this in big bold letters in the agreement. Instead they just ask us to click and move on.

It seems to me that the draft basically starts from my starting point and provides illustrations. But you can get more out of yourself here, if you ask where you want to go from the environmental nature of the privacy problem. Rather than being compelled to strengthen the assertion by examples, assume the reader is with you so far, and take whatever the next steps are that this shared hypothesis allows you.

The Temptations of Predictive Policing

-- By MathewKenneally - 4 May 2015

The aspiration of online advertising is to offer something to a customer at the moment they are most likely to purchase it. A similar aspiration exists for law enforcement, to identify a criminal before they commit a crime: "predictive policing". We should be wary of granting law enforcement access to personal metadata. The trade-off may seem reasonable in the short term, but in the long term surveillance to identify “future criminals” could compound discriminatory policing, undermine privacy, and stifle political dissent.

For instance, Australia recently passed legislation requiring ISPs to hold the metadata of all users for two years. Prime Minister Abbott advocated for the legislation, arguing that not passing the bill would require law enforcement to unilaterally disarm in the fight against terrorists and pedophiles. Warrants to access the data have been deemed unnecessary, on the basis that accessing metadata is not sufficiently intrusive. The opposition gained a concession to exclude browsing history from the data collected. Unsurprisingly, the law passed easily. The short-term trade-off, an insignificant amount of privacy for security, seems pretty good.

Contrary to the government’s claims, metadata is extraordinarily intrusive. Metadata is usually defined as data about a message, as opposed to its content. For example, it is the date, time, sender, location of the sender, and recipient of an e-mail, but not the text of the e-mail. Metadata can be used to track a person’s movements, reading, and contacts. As Justice Sotomayor observed, merely tracking a person’s movements enables the government to ascertain “their political and religious beliefs, sexual habits…”. Metadata can identify patterns in an individual’s behavior not apparent to the individual, revealing unconscious thoughts. This is why Edward Snowden said, "I would prefer to be looking at metadata than looking at content, because it's quicker and easier, and it doesn't lie".
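The metadata/content distinction can be made concrete with a small sketch. The message below is entirely invented, and which header fields count as “metadata” varies by scheme; this only illustrates the split between the envelope details a retention law captures and the body text it nominally excludes.

```python
# Sketch: what "metadata vs. content" means for a single e-mail,
# using Python's standard email parser. The message itself is invented.
from email import message_from_string

raw = """From: alice@example.com
To: bob@example.com
Date: Mon, 04 May 2015 14:23:00 +0000
Subject: lunch

Meet at noon?
"""

msg = message_from_string(raw)

# Metadata: who, to whom, and when -- the envelope details retained
# under schemes like Australia's.
metadata = {k: msg[k] for k in ("From", "To", "Date")}

# Content: the body text -- nominally excluded from collection.
content = msg.get_payload()

print(metadata)
print(content)
```

Even this trivial record shows why the distinction is thin: collect enough "who, whom, when" tuples and the pattern of a life emerges without ever reading a body.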

Predictive policing is already a reality. The Chicago Police Department used recent arrest records and historic crime data to create a “heat list” of the 400 people in an area most likely to commit a violent crime. The police visited some of those on the list, to warn them that crime does not pay. The program relied on research showing a correlation between an individual’s social network and the likelihood that she will commit a violent crime.

Data analysis makes it easier to identify networks of likely criminals. Arrest someone for a violent offense; check her metadata; find her closest ties; you have your next offender. Using metadata a “heat list” can be assembled from the entire population without any arrest. It allows authorities to map “types of people”. Police could search for “groups” they believe are likely to commit an offense. For example, police may wish to identify all young Muslim men connected to a radical preacher.
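The “check her metadata; find her closest ties” step is simple to mechanize. The following is a minimal sketch, not any agency’s actual system: the call records and names are invented, and real network analysis would be far more elaborate, but the core operation is just counting contacts.

```python
# Illustrative sketch only: ranking a person's closest ties from
# hypothetical call-metadata records (caller, callee). All names invented.
from collections import Counter

call_records = [
    ("A", "B"), ("A", "B"), ("A", "C"),
    ("B", "C"), ("B", "D"), ("D", "E"),
]

def closest_ties(person, records, top_n=3):
    """Rank the people most frequently in contact with `person`."""
    contacts = Counter()
    for caller, callee in records:
        if caller == person:
            contacts[callee] += 1
        elif callee == person:
            contacts[caller] += 1
    return [name for name, _ in contacts.most_common(top_n)]

# Arrest "A"; the metadata immediately suggests whom to scrutinize next.
print(closest_ties("A", call_records))  # ['B', 'C']
```

Run over a whole population’s retained metadata rather than one arrestee’s records, the same counting exercise yields a map of “types of people” and their associations, no arrest required.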

Individuals who have committed no offense may find themselves subject to police scrutiny merely because they associate with, or have similar interests to, offenders.

It could be argued that policing by data may avoid discrimination: algorithms cannot be racist. But data generated from previous arrests and convictions may reflect previous biases, and the interpretation of data can itself be biased. What we learn from data depends on what we ask in our analysis. The algorithm may provide a statistical justification for increasingly discriminatory enforcement.

There is a risk of criminalizing the “types of persons” put on the “heat list”. If these people are subject to more police surveillance, more offenses are likely to be detected. The data’s prophecy becomes self-fulfilling. Harcourt made this argument in his critique of the data supporting the effectiveness of broken windows policing. He suggests that misdemeanor arrests rose in “high crime areas” not because there was a high level of disorder in those areas, but because more police were placed in those areas on the lookout for misdemeanor offenses.
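The self-fulfilling mechanism can be shown with a toy simulation. All numbers here are invented for illustration: two neighborhoods offend at exactly the same underlying rate, but one is watched five times as closely, and the arrest statistics alone would make it look like a “high crime area”.

```python
# Toy simulation of the feedback loop: identical true offense rates,
# unequal police presence, unequal recorded arrests. Numbers are invented.
import random

random.seed(1)
TRUE_OFFENSE_RATE = 0.05                       # identical in both areas
detection_prob = {"North": 0.1, "South": 0.5}  # South is heavily policed

arrests = {"North": 0, "South": 0}
for _ in range(10_000):                        # 10,000 person-days each
    for hood in arrests:
        offense = random.random() < TRUE_OFFENSE_RATE
        if offense and random.random() < detection_prob[hood]:
            arrests[hood] += 1

# South records several times as many arrests; the data then "proves"
# it is a high-crime area deserving still more police.
print(arrests)
```

The recorded arrest counts measure where the police were looking, not where the crime was, yet they are exactly the data that feeds the next round of deployment decisions.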

Data can also aid predictive policing by helping to identify how “criminals” think. Suppose a security agency compares the online behavior of those who perpetrated lone wolf terrorist attacks and identifies a pattern of subconscious behavior. A search of retained metadata could identify all the “likely terrorists”.

This would allow police to treat thoughts as suspicious. It is a system that could lead an individual to be the subject of surveillance because she unconsciously thinks like a person who might do something wrong. Put another way: “she googles like a criminal”.

A system established to identify terrorists and criminals could be used to suppress political dissent. Terrorists are, after all, political dissidents that turn violent. People likely to adopt politically subversive views could be identified through their reading and social networks before they even develop their beliefs. The police could investigate, or pay a visit to warn them that unorthodox ideas do not pay. Recently, in Baltimore, the media and law enforcement have sought to blame political activists for violence. Why would police, armed with tools to identify potential activists in advance, not seek to stop such protests before they occur? We should not expect them to "unilaterally disarm".

The response to these concerns is that current metadata collection systems contain safeguards. The Australian program excludes browsing history; the USA restricts metadata use to national security and foreign intelligence matters, not domestic law enforcement.

These limitations might be hard to maintain. The incremental requests to surrender “just a little more” privacy to prevent frightening crimes are appealing. Politicians are short-term creatures. After a terrorist attack, they want to ensure they cannot be blamed for the next event. Communities are easily frightened, and promises of security, however speculative, are hard to resist.

The short-term trade-offs to enable predictive policing seem slight. Shrewdly, governments do not demand we surrender all our freedoms at once. The long-term questions are grave. Do we want to be policed based on our thoughts and associations? Can a state that preempts dissent be democratic? If these questions are not asked, we risk constructing a dystopia by increments.


r5 - 04 May 2015 - 14:23:36 - MathewKenneally
This site is powered by the TWiki collaboration platform.
All material on this collaboration platform is the property of the contributing authors.
All material marked as authored by Eben Moglen is available under the license terms CC-BY-SA version 4.