Living by the Numbers: Big Data Knows What Your Future Holds


Part 7: A Tyranny of Data?


Business models like Kreditech's illustrate the sensitivity of the issues that many new Big Data applications raise. Users, of course, "voluntarily" relinquish their data step by step, just as we voluntarily and sometimes revealingly post private photos on Facebook or air our political views through Twitter. Everyone is ultimately a supplier of this large, new data resource, even in the analog world, where we use loyalty cards, earn miles and rent cars.

Perhaps many people do so with so little inhibition because what happens to our data often remains unclear. To whom, and how often, is our data sold? Are these buyers of data also subject to rules on deleting the data and preserving anonymity? And what will happen, for example, to Kreditech's credit profiles if the small business is ever acquired by a larger company or goes bankrupt?

An attempt by SCHUFA, Germany's established credit-reporting agency, to launch a pilot project on social scoring together with the Hasso Plattner Institute revealed just how sensitively the public reacts to such issues.

As with Kreditech, the project sought to analyze data from Facebook, Twitter and other social networks and examine its role in determining creditworthiness. Merely the announcement of the project triggered angry protests, and the effort was promptly abandoned.

There was an even greater storm of indignation when many drivers realized that their navigation devices don't just help them find the best route to their destinations, but can also be used against them. TomTom, a Dutch manufacturer of GPS navigation equipment, had sold its data to the Dutch government, which then passed it on to the police. The police used the information to set up speed traps in places where they were most likely to generate revenue -- that is, locations where especially large numbers of TomTom users were speeding.

Pre-programmed Conflicts

TomTom's CEO issued a public apology, saying that the company had believed that the government wanted the data to improve traffic safety and reduce road congestion. TomTom had not anticipated the use of the data for speed traps, he said.

Similar conflicts are practically pre-programmed, because a central tension is inherent in the technology's development: Big Data applications are especially valuable when they are personalized, as in the case of credit checks and individualized medicine.

Personalized profiles, which bring together a wealth of information, from expressions of opinion on Facebook to movement profiles, provide companies with tempting possibilities. For instance, if someone "likes" a particular pair of jeans on Facebook, the store owner could send a discount coupon for precisely the same brand of jeans to that person's mobile phone the next time he or she enters the store.

This may be appealing to retailers and some consumers, but data privacy advocates see many Big Data concepts as Big Brother scenarios of a completely new dimension.

So far, many companies have tried to dispel such fears by noting that the data they gather, store and analyze remains "anonymous." But that claim, as it turns out, is not entirely accurate; it sells the power of data analysis radically short. Take the analysis of anonymous movement profiles, for example. According to a recent study published in the online journal Scientific Reports, our mobility patterns are so distinctive that they can be used to "uniquely identify 95 percent of the individuals." The more data is in circulation and available for analysis, the more likely it is that anonymity becomes "algorithmically impossible," says Princeton computer scientist Arvind Narayanan.

In his blog, Narayanan writes that only 33 bits of information are sufficient to identify a person.
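The arithmetic behind that figure is simple: 2^33 is roughly 8.6 billion, more than the world's population, so 33 bits of information are in principle enough to single out any one person. The following minimal sketch shows how individually harmless attributes can add up to that threshold; the attribute list and cardinalities are illustrative assumptions, not figures from Narayanan's post, and they assume independent, uniformly distributed attributes, which real data only approximates.

```python
import math

# Illustrative, assumed cardinalities -- each attribute contributes
# log2(number of possible values) bits if attributes are independent
# and uniformly distributed (real-world data is less tidy).
attributes = {
    "postal code": 40_000,
    "date of birth": 365 * 80,   # day of year x plausible age range
    "gender": 2,
    "car model": 2_000,
    "browser + version": 500,
}

total_bits = 0.0
for name, cardinality in attributes.items():
    bits = math.log2(cardinality)
    total_bits += bits
    print(f"{name:20s} ~{bits:5.1f} bits")

print(f"{'combined':20s} ~{total_bits:5.1f} bits")
# ~33 bits are enough to single out one person among ~8.6 billion:
print(f"threshold            ~{math.log2(8_600_000_000):5.1f} bits")
```

Under these assumed values the combined total comes to roughly 51 bits, well past the 33-bit threshold, which is why a handful of "anonymous" attributes can often pinpoint an individual.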

Heated Controversy

From the standpoint of businesses, the slightly schizophrenic attitude of consumers is the real crux of the issue. On the one hand, we have become shockingly forthcoming -- and apparently accessible -- online. Yet we ascribe the most sinister of motives to those who would analyze that data and collect more.

A study by the New York advertising agency OgilvyOne concludes that 75 percent of respondents don't want companies to store their personal data, while almost 90 percent are opposed to companies tracking their surfing behavior on the Internet.

This conflict explains the heated nature of the current controversy over the proposed new European data protection directive. If the European Commission's plans, which also include a "right to be forgotten" on the web, become a reality, many providers could see their Big Data growth fantasies in jeopardy. This is one of the reasons Brussels currently faces a barrage of lobbying from the likes of Amazon, Google and Facebook.

But for a modern society, an even more pressing question is whether it wishes to accept everything that becomes possible in a data-driven economy. Do we want to live in a world in which algorithms predict how well a child will do in school, how suitable he or she is for a specific job -- or whether that person is at risk of becoming a criminal or developing cancer?

Data Tyranny

Is it truly desirable for cultural assets like TV series or music albums to be tailored to our predicted tastes by means of data-driven analyses? What happens to creativity, intuition and the element of surprise in this totally calculated world?

Internet philosopher Evgeny Morozov warns of an impending "tyranny of algorithms" and is fundamentally critical of the ideology behind many current Big Data applications. Morozov argues that because formulas are increasingly being used in finance and, as in the case of Predictive Policing, in police work, they should be regularly reviewed by independent, qualified auditors -- if only to prevent discrimination and abuses of power.

A dominant Big Data giant once inadvertently revealed how overdue a broad social and political debate on the subject is. Google Executive Chairman Eric Schmidt says that in 2010, the company toyed with the idea of predicting stock prices by means of incoming search requests. But, he said, the idea was discarded when Google executives concluded that it was probably illegal.

He didn't, however, say that it was impossible.

Translated from the German by Christopher Sultan




© SPIEGEL ONLINE 2013
All Rights Reserved
Reproduction only allowed with permission

