Since its 2008 incursion into Georgia (if not before), there has been a remarkable evolution in Russia’s approach to propaganda. This new approach was on full display during the country’s 2014 annexation of the Crimean peninsula. It continues to be demonstrated in support of ongoing conflicts in Ukraine and Syria and in pursuit of nefarious and long-term goals in Russia’s “near abroad” and against NATO allies.
In some ways, the current Russian approach to propaganda builds on Soviet Cold War–era techniques, with an emphasis on obfuscation and on getting targets to act in the interests of the propagandist without realizing that they have done so.
...
Interestingly, several of these features run directly counter to the conventional wisdom on effective influence and communication in government and defense circles, which traditionally emphasizes the importance of truth, credibility, and the avoidance of contradiction.3 Despite ignoring these traditional principles, Russia seems to have enjoyed some success under its contemporary propaganda model, either through more direct persuasion and influence or by engaging in obfuscation, confusion, and the disruption or diminution of truthful reporting and messaging.
...
In addition to manufacturing information, Russian propagandists often manufacture sources. Russian news channels, such as RT and Sputnik News, are more like a blend of infotainment and disinformation than fact-checked journalism, though their formats intentionally take the appearance of proper news programs. Russian news channels and other forms of media also misquote credible sources or cite a more credible source as the origin of a selected falsehood. For example, RT stated that blogger Brown Moses (a staunch critic of Syria’s Assad regime whose real name is Eliot Higgins) had provided analysis of footage suggesting that chemical weapon attacks on August 21, 2013, had been perpetrated by Syrian rebels. In fact, Higgins’s analysis concluded that the Syrian government was responsible for the attacks and that the footage had been faked to shift the blame.18 Similarly, several scholars and journalists, including Edward Lucas, Luke Harding, and Don Jensen, have reported that books that they did not write—and containing views clearly contrary to their own—had been published in Russian under their names. “The Kremlin’s spin machine wants to portray Russia as a besieged fortress surrounded by malevolent outsiders,” said Lucas of his misattributed volume, How the West Lost to Putin.19
Why might this disinformation be effective? First, people are often cognitively lazy. Faced with information overload (especially on the Internet), they rely on a variety of heuristics and shortcuts to determine whether new information is trustworthy.20 Second, people are often poor at discriminating true information from false information, and poor at remembering that they have previously judged a piece of information to be false. The following are a few examples from the literature:
• In a phenomenon known as the “sleeper effect,” low-credibility sources can gain persuasive impact with the passage of time. Although people assess a source’s credibility when they first encounter information, in memory the information is often dissociated from its source. Thus, information from a questionable source may later be remembered as true, with the source forgotten.
• Information that is initially assumed valid but is later retracted or proven false can continue to shape people’s memory and influence their reasoning.
• Even when people are aware that some sources (such as political campaign rhetoric) have the potential to contain misinformation, they still show a poor ability to discriminate between information that is false and information that is correct.21 Familiar themes or messages can be appealing even if these themes and messages are false.
Information that connects with group identities or familiar narratives—or that arouses emotion—can be particularly persuasive. The literature describes the effects of this approach:
• Someone is more likely to accept information when it is consistent with other messages that the person believes to be true.
• People suffer from “confirmation bias”: They view news and opinions that confirm existing beliefs as more credible than other news and opinions, regardless of the quality of the arguments.
• Someone who is already misinformed (that is, believes something that is not true) is less likely to accept evidence that goes against those misinformed beliefs.
• People whose peer group is affected by an event are much more likely to accept conspiracy theories about that event.
• Stories or accounts that create emotional arousal in the recipient (e.g., disgust, fear, happiness) are much more likely to be passed on, whether they are true or not.
• Angry messages are more persuasive to angry audiences.22
False statements are more likely to be accepted if backed by evidence, even if that evidence is false:
• The presence of evidence can override the effects of source credibility on perceived veracity of statements.
• In courtroom simulations, witnesses who provide more details—even trivial details—are judged to be more credible.23
Finally, source credibility is often assessed based on “peripheral cues,” which may or may not reflect the reality of the situation.24 A broadcast that looks like a news broadcast, even if it is actually a propaganda broadcast, may be accorded the same degree of credibility as an actual news broadcast.25 Findings from the field of psychology show how peripheral cues can increase the credibility of propaganda:
• Peripheral cues, such as the appearance of expertise or the format of information, lead people to accept—with little reflection—that the information comes from a credible source.
• Expertise and trustworthiness are the two primary dimensions of credibility, and these qualities may be evaluated based on visual cues, such as format, appearance, or simple claims of expertise.
• Online news sites are perceived as more credible than other online formats, regardless of the veracity of the content.26