Interview with Oxford Internet Institute Head: Can We Develop Herd Immunity to Internet Propaganda?
DER SPIEGEL: The online forum Parler was instrumental as a communication channel in the run-up to the storming of the Capitol Building in Washington, D.C. Amazon has since cut the service off from its cloud hosting infrastructure. Does this mean the problem has been solved?
Howard: No, it would be nice to have a quick technical fix like that. But it’s more complicated than that.
DER SPIEGEL: But could the riot inside the U.S. Capitol have happened without social media?
Howard: Social media certainly helped the organizers, both in the long term and the short term. For years now, the sitting president, Donald Trump, has been cultivating a community of people who love conspiracies and look for extremist, sensational stories. Social media makes it easy to find, target and leverage this very specific audience. So, after laying the groundwork for a long time, he was able to suddenly ramp up his messaging and use a specific trigger when he needed it: the messages that mobilized his supporters to march on the Capitol. This kind of seamless communication with supporters would be much, much harder to establish with professional media acting as gatekeepers between a president and his supporters.
DER SPIEGEL: Trump has been permanently blocked on Twitter. But isn't that ineffective given that his followers have already moved on to more welcoming platforms like TheDonald or Parler?
Howard: No, because the people who now use Parler are a tiny minority – at most 10 percent of the electorate is regularly on such fringe sites. This limits their reach in a wider context. And their extremely libertarian views mean that platforms like Parler are choking up because they are getting swamped by porn. That limits Parler's reach: many conservatives believe in free speech and might support Trump, but they don't like meeting like-minded people surrounded by porn.
DER SPIEGEL: So, does that mean that these extremist sites are unimportant, because they can't really scale up?
Howard: No, unfortunately they play an important role. Platforms like Parler, TheDonald, Breitbart and Anon are like petri dishes for testing out ideas, to see what sticks. If extremist influencers see that something gets traction, they ramp it up. In the language of disease, you would say these platforms act as a vector, like a germ that carries a disease into other, more public forums.
DER SPIEGEL: You're saying that Parler serves as a kind of R&D lab for new and improved conspiracy theories?
Howard: Yes, and at some point a major influencer takes a new meme from one of these extremist forums and puts it out before a wider audience. It works like a vector-borne disease like malaria, where the mosquitoes do the transmission. So, maybe a Hollywood actor or an influencer who knows nothing about politics will take this idea and post it on the bigger, better known platform. From there, these memes escalate as they move from Parler to maybe Reddit and from there to Twitter, Facebook, Instagram and YouTube. We call this "cascades of misinformation."
DER SPIEGEL: Is this a one-way street?
Howard: No, it is more like an echo chamber, because many users are on multiple platforms. Sometimes the cascades of misinformation bounce from country to country, between the U.S., Canada and the UK, for example. So, it echoes back and forth.
DER SPIEGEL: So, if you disprove a conspiracy theory on one platform and in one language, it might just come bouncing back from another network or country, a little like the polio virus, which is nearly eradicated worldwide except for some countries like Pakistan that serve as its reservoir?
Howard: That is an interesting comparison. Indeed, many platforms work like reservoirs for infectious disinformation that can go viral in other places. Within Europe, two reservoirs for disinformation stand out: Poland and Hungary. That is where conspiracy theories go before they spread to other places. These cascades of misinformation are really hard to contain, a little like COVID-19.
Phil Howard, director of the Oxford Internet Institute, in Berlin in January 2015 (Foto: Hilmar Schmundt / DER SPIEGEL)
DER SPIEGEL: Do you think that new regulatory approaches by the European Union can make a difference, in particular the Digital Services Act, which, among other things, aims to strengthen content moderation?
Howard: Yes, the Digital Services Act is very helpful. Europe has a pretty good record for technology policymaking that is consultative but that, once the consultations are over, also comes with some teeth. I think Europe could have done more against disinformation during the last elections for the European Parliament. But, all in all, the EU is going in a good direction. It is hard to imagine any U.S. agency or institution showing comparable leadership, so in the U.S. this decision currently falls to private corporations like Apple, Amazon and Google.
DER SPIEGEL: But Europe is not an island, and the cascades of disinformation will keep washing through our networks.
Howard: Of course, but the European regulations have ripple effects for the whole world. It would be technically difficult, and thus costly, for global services like Facebook to offer different standards of service in different regions of the world. So, Facebook has basically adopted the standards of the EU's General Data Protection Regulation (GDPR) for users worldwide. Europe is doing a great service to citizens worldwide with its regulation.
DER SPIEGEL: Critics argue that this kind of regulation is stifling free speech.
Howard: No, that in itself is propaganda. I do think Europe and the European Commission have an opportunity here to actually strongly defend democratic discourse. And I think it is not about banning free speech, but about much more. Good regulation is about maintaining standards in public conversation. It could improve the quality of life for everyone, lessen the influence of polarization and hate speech, and thereby strengthen the basis for a free exchange of ideas. I expect social media firms to be treated more and more like publishers. This pressure would make their engineers more likely to design their forums for consensus building rather than conspiracy peddling, and for sharing high-quality information rather than only the lowest-quality information. And this cultivation of an open, fair marketplace of ideas can be a matter of life and death during the current COVID pandemic. Disinformation about COVID has cost lives, because it pushes some people to take unnecessary risks.
DER SPIEGEL: You just presented a new report on disinformation online. Have we already seen peak disinformation? Do you expect things to quiet down once Trump is gone?
Howard: That would be nice, but it is unlikely. Our findings point in the opposite direction. Our 2020 report shows that cyber troop activity continues to increase around the world. This year, we found evidence of 81 countries using social media to spread computational propaganda and disinformation about politics. That is an increase from last year's report, in which we identified 70 countries with cyber troop activity. Media firms have taken some steps against this, but the problem keeps getting bigger. Public announcements by Facebook and Twitter between January 2019 and December 2020 reveal that more than 317,000 accounts and pages have been removed by the platforms. Nonetheless, almost U.S. $10 million has still been spent on political advertisements by cyber troops operating around the world.
DER SPIEGEL: Who is behind those disinformation campaigns?
Howard: Private firms continue to bid for manipulation campaigns. Over the last year, we have identified 63 new instances of private firms working with governments or political parties to spread disinformation about elections or other important political issues. We identified 21 such cases in 2017-2018, yet only 15 in the period between 2009 and 2016. In total, we have found almost U.S. $60 million was spent on hiring firms for computational propaganda since 2009.
DER SPIEGEL: It is striking that in some metrics in your study, the UK seems to have a strong disinformation strategy, along with countries like Russia, the Philippines and Malaysia. Why would a democratic country be so heavily invested in disinformation industries?
Howard: There are a couple of reasons for that. One is that many consultants who do the innovative thinking about disinformation are usually based in the U.S. So, when they try to find new clients, they tend to go to Canada and the UK, because of the language and some similarities in the media markets. But also, within the European context, the UK has the lightest campaign regulations for elections, and the elections authority here is understaffed and overwhelmed, so that makes it perfect for disinformation agencies.
DER SPIEGEL: In Africa, it appears that Nigeria is becoming a hub for disinformation. What is the country's selling point?
Howard: In the past, there were a lot of individuals who used the street-side internet cafés in Nigeria to run those simple phishing scams. But what is happening in countries like Nigeria, Kenya, South Africa or Ghana at the moment is that political parties have started using disinformation campaigns. This has led to a more professional, targeted approach to online propaganda. But in many African countries, these campaigns use mobile text messaging more than social media at this point, because SMS is more widely used there.
DER SPIEGEL: Your report suggests that Russian customers are buying services for disinformation campaigns from Nigerian agencies. Why would well-funded Russian agencies buy disinformation services from a newcomer like Nigeria?
Howard: I guess it is about the cost of labor. If you want to run an army of thousands of fake accounts, so-called "sock puppets" that you can use for your purposes, that is a lot of work, especially if you are producing propaganda in a foreign language. So, I guess Russian actors have found a lab in Nigeria that can provide services at competitive prices. But countries like China and Russia seem to be developing an interest in political influence in many African countries, so it is possible that there is a service industry for disinformation in Nigeria for that part of the world.
DER SPIEGEL: How can we protect democracy from this globalized disinformation industry?
Howard: Any company that is listed on the New York Stock Exchange must provide a filing to the Securities and Exchange Commission (SEC) with details about the company. Why not adopt that system for social media companies that are in the business of running open public platforms for communication? Each social media company should provide some kind of accounting statement about how it deals with misuse, with reporting hate speech, with fact checking and jury systems and so on. This system of transparency and accountability works for the stock markets, so why shouldn't it work in the social media realm? It would not be an instance of overregulation; it would help maintain a functioning, open, transparent market of ideas.
DER SPIEGEL: Don’t some companies already do that?
Howard: Yes, to some extent Google, Twitter and Facebook already publish some forms of reports, but they are voluntary, incomplete, and each one has a different format. This should be regulated in a way that has worked well for stock markets: to prevent fraud, enhance trust and ensure that honest players can thrive.
DER SPIEGEL: Are social media really the problem – or are they merely symptoms of deeper problems like weakening trust in institutions and a widening of social inequities?
Howard: Probably, but addressing those would require a long-term strategy. Addressing social inequality would take many years. We have to act faster. We clearly need a digital civics curriculum. Twelve to 16-year-olds are developing their media attitudes now, and they will be voting soon. There is very good media education in Canada or the Netherlands, for example, and that is an excellent long-term strategy.
DER SPIEGEL: Can the public develop something like herd immunity against the infection with disinformation?
Howard: Herd immunity against industrialized online propaganda and disinformation is a worthy goal. But a strong mental immune system, so to speak, does not develop all by itself – we need to help it develop and become stronger, just like with a COVID vaccination. You don't just infect people; you give them a vaccine to slowly bring their immune system up to speed.
DER SPIEGEL: What’s the next frontier in disinformation campaigns? Artificial intelligence-driven deep fake videos that put words into the mouths of political opponents that they have never said?
Howard: Yes, this seems to be an emerging field. But so far, these deep fake videos do not seem to travel well; they are not very infectious, so to speak. Compressing these deep fake videos often gives them properties and a data signature that seem to be easy for filter algorithms at companies like YouTube to catch. So, in a sense, there is still something like a blood-brain barrier protecting a wider audience from being bombarded with deep fake videos and AI-driven propaganda. But that may change. Maybe we will find examples of that in our next report.