Angela Merkel has been talking about the Internet a lot recently. Just last Tuesday, for instance, while addressing important figures from the TV and publishing industries, she discussed the power of companies like Facebook and Google -- more specifically, the secret algorithms they use to sort search results and to present custom news feeds to users. The problem, Merkel argued, was that no one outside of these social media giants knows what criteria they use to filter data. It's not only problematic for political parties, she said, but for society as a whole.
The chancellor's sudden interest in the nuances of IT policy was related to one thing: Germany's upcoming federal election campaign. Merkel knows that Facebook and Google could play an important role in deciding who's going to lead the country after the vote, which is expected to take place in late 2017.
When it comes to digitalization, it is not only German industry that has had to reinvent itself -- the business of politics has also been turned upside down, especially during campaign season. But far fewer people pay attention when technologies disrupt the political landscape than when they change our economy.
Should we really be afraid of the power of algorithms? Can Facebook and Google secretly influence the way we form political opinions and affect the outcome of elections? Could Google's preferred candidates be given an unfair advantage?
The answers to these questions are unequivocal: Yes, we should; yes, they can; and yes, they could.
Facebook's wide reach alone makes the social network an important tool for shaping, or manipulating, other people's opinions. With around 1.8 billion users, it is the global mass medium of our era.
In Germany, Facebook has close to 29 million users, more than 22 million of whom consult it every day. That's a quarter of the country's entire population. Google's numbers are just as impressive. With over 90 percent of all search engine traffic routed through its servers, a modern-day gatekeeper like Google has an historically unprecedented influence over its users' lives. Even the order in which results appear on a search page can have a huge impact, affecting how well a product sells, for example, or making or breaking a candidate's ranking in the polls.
These days, online platforms have significant influence on our political awareness and they don't even deny this is true. They market their power quite openly: Advertising revenue from political parties and candidates is all part of the business model, and Google and Facebook specifically target political leaders to win them over as customers. Facebook, for example, has hosted special workshops in Berlin for politically active Germans, and the social media network published a German-language "Guidebook for Politicians and Office Holders," which advertises and explains "political campaigns on Facebook."
In the most recent edition, from March 2016, the authors use case studies to explain what data-driven campaign strategies can accomplish. Facebook can be leveraged to target users' specific political interests. In February of this year, for example, there were 44,000 people in Germany between the ages of 25 and 50 interested in the issue of minimum wages. Depending on the size of the advertising budget, some, most or even all Facebook users "who meet the defined criteria" could be shown a party's or candidate's advertisement. Politicians, for example, can target people who have "liked" their pages, as well as all of their friends -- or all people in a certain age range, such as first-time voters who live in their electoral districts.
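The targeting logic described above -- selecting an audience by age range, topic of interest and, optionally, location -- amounts to a simple filter over user attributes. Here is a minimal, purely illustrative sketch of that idea; the data model, function and criteria are hypothetical and not Facebook's actual API:

```python
from dataclasses import dataclass

@dataclass
class User:
    age: int
    interests: set
    region: str

def match_audience(users, min_age, max_age, interest, region=None):
    """Return users who 'meet the defined criteria': an age range,
    a topic of interest and, optionally, a region."""
    return [
        u for u in users
        if min_age <= u.age <= max_age
        and interest in u.interests
        and (region is None or u.region == region)
    ]

# Toy data mirroring the guidebook's minimum-wage example
users = [
    User(30, {"minimum wage", "football"}, "Berlin"),
    User(55, {"minimum wage"}, "Munich"),
    User(27, {"climate"}, "Hamburg"),
]
audience = match_audience(users, 25, 50, "minimum wage")
```

In the real system, of course, the platform evaluates such criteria against profile and behavioral data at a scale of millions of users, and the advertising budget caps how many of the matching users actually see the ad.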
Great Power, Great Responsibility
Facebook is proud of its success. In its "Guidebook," it even touts the victory of the British Conservative Party in 2015 as evidence of what its data-driven campaigns can do. The company says it played "a decisive role" by reaching out to target groups about the issues they cared about, especially in districts with close margins. The four-month campaign, it claimed, was the first time a party in Europe had "used Facebook ads in such a sophisticated and targeted manner."
In Germany, many fringe groups and far-right political parties have used Facebook to attract and mobilize supporters. Even Pegida, the xenophobic protest movement that began in Dresden, started as a Facebook group.
But paying to court specific voters online is only one of many ways to exert political influence -- and not even the most problematic, given its similarities to election posters or TV commercials.
What bothers Merkel -- and many other politicians around the world -- are the subtler forms of influence. At the media conference in Munich, the chancellor criticized the fact that algorithms latched onto previously articulated interests and simply presented more of the same. The effect, she said, only exacerbates people's entrenchment. Other critics fear that social media networks run the risk of becoming hidden political actors, ones that not only use their influence to make money but also to change the course of entire elections -- without anyone knowing. Since they regard their algorithms as business secrets, they are protected from the prying eyes of regulators.
Facebook knows some of its users so well that it has extremely clear notions of their political leanings. Characteristics such as age, friends, place of residence, pages visited, among many others, help the company to categorize them politically. The "Like" button is also a big help. People who "like" the fan pages of Angela Merkel and her party, the Christian Democratic Union, aren't very likely going to vote for the Left Party. In the United States, Facebook openly organizes people into one of three categories: "liberal," "moderate" or "conservative."
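The categorization described above -- inferring a political leaning from which pages a user has "liked" -- can be illustrated with a toy scoring scheme. Everything here is an assumption for illustration: the page names, the hand-assigned leaning scores and the bucket thresholds are invented, not Facebook's actual method; only the three output labels come from the article:

```python
# Hypothetical lookup: pages scored from -1 (assumed left-leaning)
# to +1 (assumed right-leaning). Names and scores are illustrative.
PAGE_LEANING = {
    "PageA": -0.8,
    "PageB": 0.0,
    "PageC": 0.9,
}

def categorize(likes):
    """Bucket a user's average page-leaning score into the three
    labels Facebook reportedly uses in the United States."""
    scores = [PAGE_LEANING[p] for p in likes if p in PAGE_LEANING]
    if not scores:
        return "moderate"
    avg = sum(scores) / len(scores)
    if avg < -0.3:
        return "liberal"
    if avg > 0.3:
        return "conservative"
    return "moderate"
```

A real system would weigh far more signals -- age, friends, place of residence, pages visited -- but the principle is the same: many weak indicators are aggregated into a single political label.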
Some scientists are worried that Facebook could manipulate elections by intentionally mobilizing certain groups of voters. For years, the social network has been sending its users reminders to go out and vote before elections and plebiscites, calling on them to exercise their right to vote and take part in democracy. But what if, during the next election, Facebook were to send such reminders only to certain people who, based on their interests, age or place of residence, were highly likely to vote a certain way? What if Facebook began mobilizing only members of certain political movements?
During the US Congressional election in 2010, Facebook activated voting reminders on 61 million users' newsfeeds. A joint analysis with researchers from the University of California revealed that the social network was directly responsible for boosting voter participation by 340,000 people. In close races, that could be more than enough to swing the outcome.
Companies like Facebook could "flip an election," the Internet researcher Kate Crawford told SPIEGEL ONLINE. But such "enormous power" also comes with "substantial ethical questions," she added.
Neutral or Not?
A recent US study investigating Google's potential influence on electoral decisions came to a similar conclusion. For 20 percent of undecided voters, Google's search algorithm had the power to change the way they voted, the study found. The scientists dubbed this the "search engine manipulation effect." Both Google and Facebook insisted they were not driven by a political agenda and maintained neutrality.
"Facebook would never try to control elections," the social network's chief operating officer, Sheryl Sandberg, said in 2014. Google, for its part, claims it has never altered search results to force one electoral outcome or another. "If we were to stray from that principle, people's trust in our company would be undermined," the company said in a statement.
But even the possibility of unintentional or practically uncontrollable influence over a democratic election is dangerous. It would also be naïve and negligent for policymakers to rely entirely on the internet giants' promises of neutrality.
There are legitimate doubts to be had about the networks' self-proclaimed neutrality -- and never have those doubts been voiced as loudly as during the US election cycle that comes to an end on Tuesday. The Trump camp in particular has bemoaned a liberal bias in favor of Hillary Clinton. One source of their frustration was a leaked screenshot that showed a Facebook employee asking Mark Zuckerberg: "What responsibility does Facebook have to prevent a President Trump?" Evidently, the company's capacity to do such a thing was never in doubt.
The influence of Google and Facebook goes way beyond election campaigns. Indeed, the people in charge of their algorithms are constantly making decisions that are politically relevant. Politically controversial material, for instance, is often "censored." Recently, an alliance of more than 70 civil rights groups complained that Facebook was regularly deleting posts that documented human rights violations.
The networks are also using their power to promote their own political concerns. Back in 2012, when Google launched a petition against a planned anti-piracy law, 4.5 million people signed it in a single day. The initiative was a shot across Washington's bow, a show of Google's power. Google and other opponents of the measure ultimately got their way.
Is Germany about to have its first-ever data-driven election campaign? Every major German party has sent observers to the US to learn from the Democrats' and Republicans' social media teams. Back in Germany, however, Facebook has an image problem: It is, above all else, associated with hate speech. Many lawmakers and ministers are faced with hateful online messages every day.
Now, Facebook is trying to combat its poor reputation with a classic advertising campaign: in newspapers and on TV.
The article you are reading originally appeared in German in issue 45/2016 (November 5, 2016) of DER SPIEGEL.