Do Not Censor This Blog

Margaret Roberts on Information Control in China (and What it Means for the Rest of the World)

Margaret Roberts is a CEGA Affiliate and Assistant Professor of Political Science at the University of California, San Diego. In this interview, she discusses her new book, Censored: Distraction and Diversion Inside China’s Great Firewall, with Andrew Westbury, Senior Program Manager for Agriculture and Institutions at CEGA. Earlier this month, Roberts kicked off “Evidence Watch,” a new webinar series co-hosted by CEGA and the Transparency and Accountability Initiative, with a discussion of this work.


Over the past year, industry, governments, and philanthropists have grappled with how to address online propaganda, fake news, and political disinformation. Interventions like fact-checking and news literacy programs are gaining traction, but there is concern that they don’t offer a comprehensive solution.

A major obstacle to progress is that we don’t yet understand the full extent or impacts of the problem. A recent review of scientific evidence on these issues identifies some pretty basic questions among its “key research gaps”: What are the effects of exposure to information and disinformation on individual beliefs and behavior? How do the spread and the effect of disinformation differ across countries?

Fortunately, a number of exciting new initiatives — including Facebook’s elections initiative and other investments in research on digital disinformation — are now seeking to fill these gaps. The work of CEGA affiliate Margaret Roberts, Assistant Professor of Political Science at the University of California, San Diego, suggests that solutions may be out there.


Roberts’s new book Censored: Distraction and Diversion Inside China’s Great Firewall documents her nearly 10-year effort to explore and understand the largest and most sophisticated effort at information control in human history. Not only does she document China’s internet censorship playbook; she provides rare evidence of how these strategies influence political discourse, citizen behavior, and information access.


For those hoping to promote the open and free exchange of information, the conclusions of Roberts’s book may be concerning: China’s use of friction (slowing down or obstructing the spread of information), fear (threats or punishments), and flooding (coordinated campaigns to distract, confuse, or propagandize) affects millions of internet users throughout the country. However, there are important gaps in the system: 1) observable instances of censorship can actually create more interest in off-limits subjects, and 2) citizens are motivated to overcome friction in moments of crisis.

I spoke with Roberts to find out what we can learn from the Chinese example, and how her work can inform the protection of open and dynamic digital spaces. Our conversation is below.

AW: How does your work in China inform efforts to promote open digital dialogue?

MR: First, China has become the model for Internet censorship for many other authoritarian regimes. The government has developed one of the most sophisticated systems for information control, one that countries around the world have sought to emulate. Understanding how censorship works in China is essential to understanding and predicting how government control over the digital sphere will play out in countries that are in the process of adopting the same strategies.

Beyond shaping how authoritarian regimes function, China also has huge sway over companies and markets in democracies. Because its market is so attractive, Internet companies, academic publishers, and even Hollywood are influenced by its censorship regulations. These regulations and pressures then affect consumers outside of China. As China becomes increasingly central to the world economy, its censorship system will begin to influence content that people see around the globe.

AW: You refer to “friction” as a strategy for controlling information by increasing the costs and time required to access or share it. In your book, you argue that friction influences people’s behavior and online dialogue. Yet it seems to be largely absent from public debate about the “information problem,” despite the repeal of net neutrality in the U.S. and the growth of social media restrictions in East Africa. How severe and pervasive is information “friction,” and how might we address it?

MR: The Internet has vastly expanded the amount of content that people can view easily. But consumers of information have limited time, so whatever is easiest to access strongly shapes what they interact with. Censorship by friction occurs when governments block content, reorder search results, or simply throttle websites. All of these methods can have major impacts on information consumption, even when they are easy to get around, because people are often unaware that they are being manipulated, and because they are impatient online and highly sensitive to speed and ordering.
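To make the ordering point concrete, here is a toy simulation of my own (not from the book), under a simple position-bias assumption: readers click the item at rank r with probability 0.6/r and stop after their first click. Demoting a story from first to fourth place, without blocking it at all, cuts its readership by more than an order of magnitude.

```python
import random

# Toy simulation of "friction by ordering" (illustrative assumptions only):
# readers click the item at rank r with probability 0.6/r and stop after
# their first click, so demoting a story starves it of views without
# ever blocking it.

def simulate_views(ranking, n_readers=100_000, seed=0):
    """Count views per item under a 0.6/rank position-bias click model."""
    rng = random.Random(seed)
    views = {item: 0 for item in ranking}
    for _ in range(n_readers):
        for rank, item in enumerate(ranking, start=1):
            if rng.random() < 0.6 / rank:
                views[item] += 1
                break  # the reader stops after the first story they open
    return views

top = simulate_views(["sensitive_story", "story_a", "story_b", "story_c"])
bottom = simulate_views(["story_a", "story_b", "story_c", "sensitive_story"])

print("views when ranked 1st:", top["sensitive_story"])     # ~60,000 expected
print("views when ranked 4th:", bottom["sensitive_story"])  # ~3,400 expected
```

The exact numbers depend entirely on the assumed click model; the point is only that ordering alone can function as censorship.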

Of course, some information has to be at the top of the news feed or at the top of a search. But as consumers of information, we should be very interested in, and concerned about, why certain information is easier or quicker to access, and who decides what information is given priority. When governments begin to dictate what information is easier to access, as many are doing around the world, it quickly slips into censorship. In many countries, users may not even be aware that their information consumption is affected by government policies. Research that studies government strategies for ordering information, compares how platforms behave across countries, and uncovers what types of information are blocked or throttled in each country is imperative to making consumers aware of how information is prioritized for them.

AW: In Censored, you provide rare evidence showing how “flooding” (coordinated campaigns to distract, confuse, or propagandize) influences online dialogue, and how China’s notorious “Fifty Cent Party” (online propagandists) prefers non-argumentative praise, inspirational quotes, or slogans over criticism or taunting as mechanisms to influence public opinion. What would it take to study flooding outside of China, and how do you think approaches would differ?

MR: As it has become easier for governments to hire large numbers of people or to program large numbers of bots to add information to the Internet, flooding has become an increasingly popular strategy for influencing what content people see online. We’ve seen evidence of flooding in countries all over the world, from China to South Korea to Russia. In the book and in this paper, I describe how we used leaked “ground truth” data to show that the Chinese government uses online astroturfers to distract from events unfolding online. In most cases, flooding is difficult to study because we cannot tell who is working for a government and who is stating a genuine opinion. As I describe in the book, in cases where we lack ground truth data, flooding can sometimes be studied by identifying unusually coordinated social media posts whose timing would only be possible through synchronization. While China has its own strategy of flooding that involves “cheerleading” for the government rather than engaging with critics, collecting many instances of flooding cross-nationally and comparing them would make a huge contribution to the field.
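As a rough illustration of that synchronization idea (a minimal sketch of my own, not Roberts’s actual method), one could group near-duplicate posts and flag groups in which many distinct accounts publish within an implausibly short window. The post format and thresholds below are assumptions for the sake of the example.

```python
from collections import defaultdict
from datetime import datetime

# Illustrative sketch: flag bursts of near-identical posts from many distinct
# authors within seconds of each other -- a telltale sign of synchronized
# "flooding." The post format and thresholds are assumptions, not a real API.

posts = [
    ("user_a", "Great progress for our city!", "2013-05-01 09:00:01"),
    ("user_b", "Great progress for our city!!", "2013-05-01 09:00:03"),
    ("user_c", "great progress for our city", "2013-05-01 09:00:04"),
    ("user_d", "Traffic was terrible today", "2013-05-01 11:30:00"),
]

def normalize(text):
    """Crude near-duplicate key: lowercase, keep only letters and digits."""
    return "".join(ch for ch in text.lower() if ch.isalnum())

def find_synchronized_bursts(posts, min_authors=3, max_window_seconds=60):
    """Return message groups posted by many distinct authors in a short window."""
    groups = defaultdict(list)
    for author, text, ts in posts:
        when = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        groups[normalize(text)].append((author, when))
    bursts = []
    for key, items in groups.items():
        authors = {a for a, _ in items}
        times = sorted(t for _, t in items)
        span = (times[-1] - times[0]).total_seconds()
        if len(authors) >= min_authors and span <= max_window_seconds:
            bursts.append((key, len(authors), span))
    return bursts

for key, n_authors, span in find_synchronized_bursts(posts):
    print(f"possible flooding: {n_authors} accounts in {span:.0f}s -> {key!r}")
```

A real study would need fuzzier text matching and baselines for organic virality, but the core signal is the same: one message, many accounts, one moment.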

