
Defense Against the Dark Arts: Networking & Propaganda: Trumpism, Russia, China, Milo


wildfire

In honor of MisinfoCon this weekend, it's time for a brain dump on propaganda — that is, getting large numbers of people to believe something for political gain. Many of my journalist and technologist colleagues have started to think about propaganda in the wake of the US election, and related issues like "fake news" and organized trolling. My goal here is to connect this new wave of enthusiasm to history and research.

This post is about persuasion. I'm not going to spend much time on the ethics of these techniques, and even less on the question of who is actually right on any particular point. That's for another conversation. Instead, I want to talk about what works. All of these methods are just tools, and some are more just than others. Think of this as Defense Against the Dark Arts.



I'll only post the methods that work. If you want to read about the counters so you can protect yourself, click the link.




Russia: You don't need to be true or consistent

Russia has a long history of organized disinformation, and its methods have evolved for the Internet era. The modern strategy has been dubbed "the firehose of falsehood" by RAND scholar Christopher Paul.


The experimental psychology literature suggests that, all other things being equal, messages received in greater volume and from more sources will be more persuasive. Quantity does indeed have a quality all its own. High volume can deliver other benefits that are relevant in the Russian propaganda context. First, high volume can consume the attention and other available bandwidth of potential audiences, drowning out competing messages. Second, high volume can overwhelm competing messages in a flood of disagreement. Third, multiple channels increase the chances that target audiences are exposed to the message. Fourth, receiving a message via multiple modes and from multiple sources increases the message's perceived credibility, especially if a disseminating source is one with which an audience member identifies.


Potential losses in credibility due to inconsistency are potentially offset by synergies with other characteristics of contemporary propaganda. As noted earlier in the discussion of multiple channels, the presentation of multiple arguments by multiple sources is more persuasive than either the presentation of multiple arguments by one source or the presentation of one argument by multiple sources. These losses can also be offset by peripheral cues that enforce perceptions of credibility, trustworthiness, or legitimacy. Even if a channel or individual propagandist changes accounts of events from one day to the next, viewers are likely to evaluate the credibility of the new account without giving too much weight to the prior, "mistaken" account, provided that there are peripheral cues suggesting the source is credible.


China: Don't argue, distract and disrupt

The Atlantic has a readable summary of recent research by Gary King, Jennifer Pan, and Margaret E. Roberts. They started with thousands of leaked Chinese government emails in which commenters report on their work, which became the raw data for an accurate predictive model of which posts are government PR (a toy sketch of that kind of classifier appears at the end of this section). A surprising twist: nearly 60% of paid commenters will just tell you they're posting for the government when you ask them, which allowed these scholars to verify their country-wide model. But the core of the analysis is what these posters were doing.

From the paper:

We estimate that the government fabricates and posts about 448 million social media comments a year. In contrast to prior claims, we show that the Chinese regime's strategy is to avoid arguing with skeptics of the party and the government, and to not even discuss controversial issues. We infer that the goal of this massive secretive operation is instead to regularly distract the public and change the subject, as most of these posts involve cheerleading for China, the revolutionary history of the Communist Party, or other symbols of the regime.

Note that this is only one half of the Chinese media control strategy. There is still massive censorship of political expression, especially of any post relating to organized protest, which is empirically good at toppling governments.

All of this without ever getting into an argument. This suggests that there is actually no need to engage the critics and trolls to get your message out (though it might still be worthwhile to distract and monitor them). Just communicate positive messages to the masses while you quietly disable your detractors.
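
To make the predictive-model idea above a bit more concrete, here is a toy sketch of how "which posts are government PR" can be framed as a text-classification problem. Everything in it is an assumption for illustration: the invented posts, the labels, and the choice of TF-IDF features with logistic regression. It is not the method from the King/Pan/Roberts paper, which relied on the leaked accounts and much more careful validation.

```python
# Toy sketch (not the paper's method): given a small labeled set of posts,
# train a bag-of-words classifier and score new posts for how "PR-like" they are.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented examples: 1 = known government PR, 0 = ordinary chatter.
posts = [
    "So proud of our country's great achievements this year!",
    "Remember the heroes of the revolution and their sacrifice.",
    "Traffic on the ring road is terrible again today.",
    "Does anyone know a good noodle place near the station?",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

# Score an unseen post: estimated probability it belongs to the PR class.
new_post = "Let's all celebrate the glorious history of our nation!"
print(model.predict_proba([new_post])[0][1])
```

Text features alone are a crude proxy, of course; the interesting part of the real study is the validation step described above, where paid commenters simply confirmed who they were working for.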


Milo: Attention by any means necessary



The key tactic of alternative or provocative figures is to leverage the size and platform of their "not-audience" (i.e. their haters in the mainstream) to attract attention and build an actual audience. Let's say 9 out of 10 people who hear something Milo says will find it repulsive and juvenile. Because of that response rate, it's going to be hard for someone like Milo to market himself through traditional channels. His potential audience is too spread out, and doesn't have that much in common. He can't advertise, he can't find them one by one. It's just not going to scale.

But what if he can acquire massive amounts of negative publicity by pissing off people in the media? Well, now all of a sudden someone else is absorbing the cost of this inefficient form of marketing for him.


Attention is the currency of networked propaganda. Attention is the key. Be very careful who you give it to, and understand how your own emotions and incentives can be exploited.


Debunking doesn't work: provide an alternative narrative


Telling people that something they've heard is wrong may be one of the most pointless things you can do. A long series of experiments shows that it rarely changes belief. Brendan Nyhan is one of the main scholars here, with a series of papers on political misinformation. This is about human psychology; we simply don't process information rationally, but instead employ a variety of heuristics and cognitive shortcuts (not necessarily maladaptive in general) that can be exploited.


The role of intelligence: Action not reaction

Posts spiked around political events (like a CCP Congress) and emergencies that the government would rather citizens not talk about, such as riots and a rail explosion. This "cheerleading" propaganda wasn't simply a regular diet of good news, but a precisely controlled strategy designed to drown out undesirable narratives.
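
If you had a daily count of these posts, the burst pattern described above would show up as sharp spikes against a quiet baseline. The sketch below is one minimal way to flag such spikes; the counts, window size, and threshold are all invented for illustration, and the study itself worked from actual leaked posts rather than anything like this detector.

```python
# Minimal sketch: flag days whose post volume jumps far above a trailing baseline,
# the kind of coordinated burst described above. All numbers are illustrative.
import statistics

daily_counts = [120, 135, 128, 140, 131, 125, 990, 1040, 133, 127]  # invented data

def flag_spikes(counts, window=5, threshold=3.0):
    spikes = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]
        mean = statistics.mean(baseline)
        spread = statistics.pstdev(baseline) or 1.0  # avoid dividing by zero
        if (counts[i] - mean) / spread > threshold:
            spikes.append(i)
    return spikes

print(flag_spikes(daily_counts))  # indices of suspicious volume bursts
```

A trailing window is used so each day is compared only against the days before it, which matches the intuition of a burst standing out from recent normal volume.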

One of the problems of a free press is that "the media" is a herd of cats. There really is no central authority — independence and diversity, huzzah! Similarly, distributed protest movements like Anonymous can be very effective for certain types of activities. But even Anonymous had central figures planning operations.

The most successful propagandists, like the most successful protest movements, are very organized.

http://jonathanstray.com/networked-propaganda-and-counter-propaganda
 