Covert Action in the Age of Social Media

Editor's Note: This article originally appeared in the Summer/Fall 2017 edition of the Georgetown Journal of International Affairs (Volume 18, No. 2), available for purchase from Georgetown University Press.

Although most countries conduct covert action operations, Russia is particularly well suited in historical, technical, and strategic terms to perform successful influence campaigns in the Cyber Age.

In the world of intelligence, few techniques go away with the passage of time. Human espionage, practiced since ancient times, and order of battle analysis, used to win the Battle of Gettysburg, have been enduring tools of intelligence work for centuries. At the same time, new techniques are constantly being invented. Since World War II, these have included satellite reconnaissance, data mining, and a host of so-called structured analytic techniques. Presently, we are seeing the development of a new mode of covert action. While there is much that we still do not know, it seems clear that its leading practitioner is Russia. Because of its size, technical prowess, and political and intelligence cultures, Russia is well positioned to exploit opportunities for covert influence offered by the Internet and social media. Specifically, it seems to be developing powerful covert influence capabilities that enable it to shape the perceptions of broad swathes of a population and thus influence elections or bring pressure to bear on democratic governments.

This article describes this new form of influence operations. After defining covert action—of which influence operations are a particular example—I will discuss Russian intelligence strategy and suggest that Russia is using modern tools to pursue an influence campaign—a form of covert action—that has its genesis in the pre-cyber age. While there is no reason to think that the United States or other Western services could not employ similar approaches now or in the future, this article discusses the historical, technical, and strategic factors that make Russia particularly well suited to carry out this type of operation. I close with some thoughts on how the private sector and government can thwart such influence campaigns.

Defining Covert Action

Most countries of any stature on the global stage conduct covert action, though that specific term is American. The British, for instance, call it “special political action,” while the Russians call it “active measures.” Regardless of its name, a covert action is something done to produce an effect in the world while obscuring who is responsible for it. Americans tend to think of covert action as being intrinsically lethal, having in mind the 1953 coup against Mossadeq in Iran (mounted with the cooperation of Britain’s MI6) or the Bay of Pigs invasion. However, covert action can also be the non-lethal application of influence, as happened when the CIA propped up non-Communist political parties during the 1948 election campaign in Italy. The Soviet KGB also had an extensive track record of conducting covert influence campaigns, as when it circulated a forged US Army field manual indicating that the American military was conducting false-flag terrorist attacks in allied countries.1

The element that divides covert action from other government actions is plausible deniability: the ability of authorities to deny knowledge of, or responsibility for, any action committed by their agents, often involving the implicit offering of an alternate theory of responsibility. For instance, in the late 1970s, when the Bulgarian service wanted to assassinate dissident Georgi Markov, then resident in London, it apparently got a Danish citizen of Italian background to do the job.2 More recently, the United States Intelligence Community has concluded that the 2016 theft of emails from the Democratic National Committee and their provision to WikiLeaks was the work of Russian intelligence. However, the public face of this operation has been an entity calling itself “Guccifer 2.0,” who claims to be Romanian, denies any connection with Russian intelligence, and purports to detest Russian foreign policy.3

The Challenge of Influencing Populations in the Twentieth Century

It has long been possible—though not necessarily easy—to influence the perceptions and actions of small numbers of people. A shining example occurred during World War II. In the early days of the conflict, British intelligence (with some help from its Allies) launched an open-ended strategic deception campaign intended to keep the German military both off-balance and ignorant of the real strength and intentions of the Allies. This effort continued all the way to the end of the war and helped protect many Allied operations with a “bodyguard of lies,” in the famous words of Winston Churchill.

As one part of this broader effort, the Allies conducted a massive deception effort, FORTITUDE SOUTH, during the run-up to the June 1944 Allied invasion of Normandy. That operation aimed to influence a small group of German officers and decision-makers on a relatively discrete question: how should Germany employ its defensive forces along the coast of France? Before the D-Day invasion, the British were able to create a picture in the minds of German intelligence officers of what the Allies meant to do. That picture, of course, was false, intended to convince the Germans that the Allies were not going to invade Normandy, and that if they did, it would merely be a feint to distract from the main effort at the Pas de Calais. To do this, the British used numerous double agents to send a potent mixture of true and false data to the Germans. They also transmitted deceptive radio signals so that German signals intelligence stations could pick them up. They passed misleading rumors through diplomatic channels to neutral diplomats in hopes that they would make their way to the Germans. Allied military leaders, notably General George S. Patton, made publicized appearances at times and places chosen to bolster the deception. The British planted fake news stories, had soldiers wear patches of non-existent military units in places where German spies might see them, and undertook a host of other such efforts.


Though the Germans did not pick up every signal the Allies sent and did not necessarily believe everything they saw, the totality of the Allied effort led German intelligence officers to see convincing information about the Allied strength and intentions everywhere they looked. In constructing a mosaic of truth and falsehoods, the Allies implanted a false belief about their plans in the minds of the German high command. This deception campaign led German leaders to inadvertently help the Allied cause.4

However, during both World War II and the Cold War, it was much more difficult to influence the perceptions and actions of broad populations. The Allies, for instance, put a great deal of effort into undermining the morale of the German population through strategies ranging from radio broadcasts and counterfeit German postage stamps mocking Hitler to aerial bombing campaigns. These efforts, though, had little effect because influencing the political views and preferences of millions of people simultaneously was not possible. In the Cold War, a single article planted in a mainstream newspaper or magazine was a victory for the KGB. So, too, was gaining covert control of a left-wing newspaper or magazine with a circulation of a few thousand. Even when the Soviets had such successes, the populations they were trying to influence remained surrounded by information from ideologically different sources. As Andrew Weisburd, J. M. Berger, and Clint Watts recently noted, “The KGB’s Cold War efforts to [influence Western media] . . . bore significant financial costs while producing little quantifiable benefit. Stories were difficult to place in mainstream media outlets, and the slow process made it challenging to create momentum behind any one theme.”5

Creating Alternate Realities in the Twenty-first Century: Using Lessons from the Past

Intelligence services are now capable of influencing large populations while, in some circumstances, using relatively few resources. This is largely because many politically interested Americans (as well as others in the Western world) live in their own political realities and choose to consume only information favorable to their worldview. Recognizing this behavioral tendency, a sophisticated actor like Russia can reinforce or even help create those realities, thereby influencing the political behavior of broad swathes of a population. The consequences of such actions can be significant in the political realm, potentially swaying an election, pressuring a democratic government to change policies, or simply undermining the consent of the governed.6

The events of 2016 and 2017 illustrate this argument. In early 2017, the United States intelligence community concluded that among Russia’s covert goals vis-à-vis the United States was to “undermine the U.S.-led liberal order” around the world, as well as to destabilize the presidential candidacy of Hillary Clinton and support that of Donald Trump.7 Recent scholarship, moreover, suggests that Russia has launched a similar active measures effort against Sweden, in a case involving significant forgeries of embarrassing documents.8 There has been additional speculation that Russia may already be mounting operations against the 2017 German and French elections.9

A general rule of covert action is that it is most likely to succeed when there is a genuine constituency with consonant goals to work with and support in the target country. The lack of such a constituency, for instance, is one of the main reasons that the CIA’s efforts to prevent the inauguration of Salvador Allende in 1970 failed.10 On the other hand, the presence of such a constituency helped make the CIA’s support for the non-Communist Italian parties in 1948 and its support for the Afghan resistance to the Soviets in the 1980s successful.

Enabled by the Internet and social media such as Facebook and Twitter, Americans today have divided themselves into vocal and activist political factions.11 Put another way, the American political scene provides numerous constituencies of entirely loyal citizens that a foreign power such as Russia can reinforce in order to serve its own interests. Again quoting Weisburd, Berger, and Watts: “without stepping foot in America, Russia’s coordinated hackers, honeypots, and hecklers [can] influence Americans through people-to-people engagement.”12 Given these existing divisions, an actor set on influencing American popular opinion need only determine which constituencies to support before acting covertly upon them. The covert provision of support to such constituencies (be they witting or unwitting) is the kind of undertaking at which modern intelligence services excel. In the case of Russia, providing such support would certainly be a question of national policy decided by President Putin and his closest advisers. Based on the nature of what was leaked, it appears that the Bernie Sanders and Donald Trump constituencies proved most useful to Moscow.13


It is often easiest and most effective for intelligence services to feed a mixture of truth and falsehoods to their target constituencies, just as the British did in the case of the D-Day deception. Facts are often easier to generate than lies—they can also be the sugar that helps the falsehoods go down. Cyber espionage can generate huge amounts of genuine data in short periods of time (far more than could be stolen in the pre-Internet era), making cyber operations an excellent means of finding congenial true facts. This data can then be inserted into the news feeds of the appropriate political constituency through cutouts to maintain plausible deniability. It appears that Russian intelligence used Guccifer 2.0 and WikiLeaks in such a way to make political hay out of the emails stolen in 2016 from the Democratic National Committee, the Democratic Congressional Campaign Committee, Hillary Clinton’s campaign chairman, John Podesta, and Colin Powell.14

It is also not difficult—though labor intensive—to manipulate stolen data or to forge wholly false data if there is genuine data to use as a model. Though there is presently only fragmentary evidence to suggest that the Russian services may have done this as part of their effort to influence the 2016 presidential campaign, Soviet services are known to have forged documents to influence American politics during the Cold War.15 As mentioned above, they also seem to have recently used forgeries in Sweden.16

Russia, like states with similar capabilities, has other tools at its disposal to inject information and disinformation into the fragmented American political discourse. The Russian government can use its official news outlets such as RT or Sputnik to promulgate fake or misleading news stories.17 It can also use official statements and actions to create news, for instance through public statements by Putin or military deployments. It is also possible for Russia to generate fake or misleading social media content out of whole cloth.18 Interestingly, Russian influence efforts seem to have paid little attention to internal consistency, an intriguing departure from the British deception efforts of World War II.19 By providing multiple (albeit false) analytic paths to the conclusion it wants readers to reach, Russia may be banking on the tendency of readers to seek out information that fits their preconceived ideas and to reject all other data.


Bots and trolls play an important role in influencing the readers’ conclusions, mostly by providing large volumes of information and making it more psychologically believable.20 While thousands of bots can be used to promulgate content of any kind, Russia’s vast armies of Internet trolls can police the entire process by shouting down and intimidating dissenters and by disseminating and re-disseminating stories, keeping the echo chamber resounding.21

In short, it is now possible for Westerners by the millions to live in hermetically sealed political realities. Often these are of their own creation, but they can also be shaped or even created by Russia to serve Russian interests.

What Enables Russia to Do This?

Much is unknown about precisely what Russia is doing and what the dividing line is between Russian efforts and those by other political entities or actors. Nevertheless, Russia is a logical country to pioneer these techniques, and there is a great deal of evidence that it is doing so. Russia is leading the pack in this field for several reasons.

First, Russia—both in its own right and through its predecessor, the Soviet Union—has an extensive tradition of covert operations. In short, Russia is good at covert influence and is building on an existing strength.

Second, as the recent studies and reportage on APT28 (FANCY BEAR) and APT29 (COZY BEAR) indicate, Russia is a leader in the field of cyber espionage, the most productive way of generating true or alterable content to feed a covert influence campaign.22

Third, Russia traditionally understands intelligence as a tool of national power in its own right. The symbol of Soviet intelligence was a sword and a shield, indicating that intelligence existed to defend the country (or, more properly, the Communist Party of the Soviet Union) and to smite its enemies. The KGB lexicon dating to the 1970s defines intelligence as “a secret form of political struggle.”23 The understanding of intelligence that it represents has endured in post-Soviet Russia. It follows that Russia might be willing to expose one of its sensitive intelligence sources or methods if doing so produces a strategic payoff.

By contrast, the United States and most Western countries see intelligence primarily as an aid to sound decision-making—not a form of power—so covert action is usually wedged awkwardly into definitions that are otherwise about the acquisition and use of information.24 The seal of the CIA represents this philosophy, portraying a sharp-eyed eagle above a shield emblazoned with a compass rose, symbolically indicating that the agency exists to protect the United States by providing knowledge of what is going on around the world. It goes against the cultural grain for the US intelligence community to blow its sources and methods by releasing secretly acquired intelligence.25 As a result, American intelligence officials generally consider it better to retain such information to provide and maintain a comparative advantage for the United States in decision-making.

Fourth, the Russian government has the flexibility to promulgate whatever “facts” or political messages seem to fit its immediate need. This, too, is probably a legacy of the communist Soviet period. When history has an intrinsic directionality, as communist doctrine assured believers it did, there is no need to tell the truth—your side will win in any event, and the only question is how quickly. The United States is quite a different case. Though there have certainly been occasions on which the US government has lied or promulgated disinformation, in general, the United States’ greater emphasis on transparency and accountability—supported by a free press—does not allow for as much promulgation of disinformation as in Russia. A good example of this is the injunction to Radio Free Europe that it must provide straight reportage.26 In addition, there are powerful imperatives for telling the truth rooted in American political culture and law. For instance, it is illegal for the US government to conduct disinformation operations against the American people; hence it must take careful steps to isolate any such efforts from exposure to the US media, a very difficult task in the modern information environment.


A recent report from the RAND Corporation has aptly referred to Russia’s “fire hose of falsehood.”27 This highlights the fifth point: to the extent that these techniques depend on volume—producing a large volume of content and fielding large armies of Internet trolls—Russia, as a large and technologically sophisticated country, has advantages over most other states.

Though Russia is pioneering this new form of covert influence operation and leading the field of cyber espionage, other international actors also possess such capabilities and may undertake similar operations in the future. Just like Russia, some other countries have an action-oriented intelligence culture, a disregard for the truth, and the size to practice these techniques on a large scale. It could be argued that China, notably, is practicing a variant of this social media-enabled covert influence technique, but focusing it internally and emphasizing the blocking of external content.28 Other entities such as Turkey, with its vigorous Internet-based defense of Ataturk, or the Sunni jihadist community, might be able to conduct more narrowly focused versions of this technique as well, though the jihadists will not be able to support such operations with extensive espionage operations as Russia has done. Indeed, ISIS has already made use of bots to spread its message.29

The Way Ahead

All this notwithstanding, there is good reason not to panic. To begin with, it is too early to say how powerful these new techniques really are. They can certainly produce much Sturm und Drang, but at this time it is impossible to attribute any specific political outcome to them. It is equally certain that these techniques will not be a silver bullet for autocrats and other rival powers. Perhaps such measures can swing elections, though the results of the 2016 American presidential election, with its popular vote margin of more than two million in Hillary Clinton’s favor, are scarcely evidence of that. Moreover, an indirect strategy for influencing political leaders in democratic states runs up against the fact that even democratic leaders do not always respond to the demands of their constituents, let alone a minority of their voters. Consider, for instance, George W. Bush’s decision to invade Iraq in the face of virulent opposition from much of the American public.

Additionally, it is a hopeful sign that the flurry of activity in 2016 has been reliably attributed to Russia. While some Americans do minimize or deny the involvement of Russian cyber espionage in the 2016 presidential campaign, a far more common response has been to acknowledge it while still paying greater attention to the alleged malfeasance revealed in the leaked emails. This lack of plausible deniability will not give comfort to the Russian officers responsible for planning future active measures campaigns.

Finally, the histories of intelligence and war both show that for every measure there is a countermeasure. In this case, a raft of partial countermeasures will probably be necessary. Both the private sector and the government have roles to play.

Given the freedom of the press and the fact that so much of America’s cyber infrastructure is privately owned, the private sector effort will be indispensable. Part of the solution will be outside the realm of governmental action. Social media outlets should do serious research into the best ways of preventing their platforms from being used to spread fake and misleading “news.” Facebook’s Mark Zuckerberg has already announced the beginning of such an effort at his company.30 The research community also has a role to play in developing technical solutions.

Responsible media outlets should consider the ethical implications of printing news known or reasonably suspected to be provided by hostile intelligence services. It may be worth formally declaring the practice unethical, given that some analogous norms—and even laws—already exist. In the United States, for instance, there are norms against so-called checkbook journalism. Moreover, any American editor who knowingly took direction from a foreign government without disclosing that fact would probably be in violation of the Foreign Agents Registration Act. In another domain, evidence acquired illegally cannot be used in court in the United States even if there is no reason to doubt its authenticity. Increasing cybersecurity in the private sector, of course, is an absolute necessity.

The government can also contribute to the nation’s defenses against future influence campaigns. Here, too, a renewed commitment to cybersecurity will be important. There may be a useful role for cyber action against troll farms or foreign purveyors of false news stories. Finally, deterrence by punishment may play a role. The still immature art of cyber deterrence may generate useful insights into how to counter this new form of covert influence. It may be, for instance, that tit-for-tat retaliation is the best route. Alternately, less precise, more shadowy responses may also be effective. The Director of National Intelligence could summon the Russian rezident in Washington and threaten unspecified consequences. The Russians might then imagine the next major American cyber operation they uncover to be American retaliation and, in effect, deter themselves.

It cannot be denied that the United States and other democratic states face a major new challenge from this form of covert action. However, in the world of intelligence and covert action there is no permanent killer app.

Dr. Mark Stout is the program director of the MA in Global Security Studies at Johns Hopkins University. He has published articles in Intelligence and National Security, Studies in Intelligence, The Journal of Strategic Studies, and Studies in Conflict and Terrorism, and is currently working on a book on American intelligence during World War I.


1. Department of State, “Misinformation about ‘Gladio/Stay Behind’ Networks Resurfaces,” Internet.

2. Matthew Brunwasser, “A Book Peels Back Some Layers of a Cold War Mystery,” Internet.

3. Lorenzo Franceschi-Biccherai, “We Spoke to DNC Hacker ‘Guccifer 2.0,’” Internet.

4. The best accounts of this are in Michael Howard, Strategic Deception in the Second World War (London: Pimlico, 1992), and Thaddeus Holt’s The Deceivers: Allied Military Deception in the Second World War (New York: Scribner, 2004).

5. Andrew Weisburd, J.M. Berger, and Clint Watts, “Trolling for Trump: How Russia is Trying to Destroy our Democracy,” Internet.

6. Ibid.

7. Office of the Director of National Intelligence, Assessing Russian Activities and Intentions in Recent US Elections, January 6, 2017.

8. Martin Kragh and Sebastian Åsberg, “Russia’s Strategy for Influence through Public Diplomacy and Active Measures: The Swedish Case,” Journal of Strategic Studies, DOI: 10.1080/01402390.2016.1273830, 2017, 1–44. See also “Concern over Barrage of Fake Russian News in Sweden,” The Local, July 27, 2016; and Marcin Andrzej Piotrowski, “The Swedish Counter-Intelligence Report on Hostile Russian Activities in the Region in a Comparative Context,” The CSS Blog Network, CSS ETH Zurich, April 19, 2016.

9. Caroline Copley, “Spy Chief Adds to Warnings of Russian Cyber Attacks on Germany,” Internet; Reuters, “Germany Investigating Unprecedented Spread of Fake News Online,” Internet; Nicholas Hirst, “France Braces for Election Cyberattacks,” Internet. See also Marcin Andrzej Piotrowski, “The Swedish Counter-Intelligence Report on Hostile Russian Activities in the Region in a Comparative Context,” Internet.

10. Kristian C. Gustafson, “CIA Machinations in Chile in 1970,” Studies in Intelligence 47:3 (2003). See also Kristian Gustafson, Hostile Intent: U.S. Covert Operations in Chile, 1964–1974 (Washington: Potomac Books, 2007).

11. An excellent resource to illustrate this point is the Wall Street Journal’s “Blue Feed, Red Feed” website.

12. Weisburd, Berger, and Watts.

13. ODNI, Assessing Russian Activities and Intentions.

14. Department of Homeland Security, “Joint Statement from the Department of Homeland Security and Office of the Director of National Intelligence on Election Security,” Internet; Eric Geller, “Russian Hackers Infiltrated Podesta’s Email, Security Firm Says,” Internet; “Threat Group-4127 Targets Hillary Clinton Presidential Campaign,” Internet; Dmitri Alperovitch, “Bears in the Midst: Intrusion into the Democratic National Committee,” Internet; “FANCY BEAR Has an (IT) Itch They Can’t Scratch,” Internet; Eric Lichtblau and Eric Schmitt, “Hack of Democrats’ Accounts Was Wider than Believed, Experts Say,” Internet; Lorenzo Franceschi-Biccherai, “How Hackers Broke Into John Podesta and Colin Powell’s Gmail Accounts,” Internet; Thomas Rid, “How Russia Pulled Off the Biggest Election Hack in U.S. History,” Internet.

15. In November 2016 the FBI was investigating forged documents apparently intended to harm the Clinton campaign. The provenance of these documents was not clear, but at least one made its way into the social media domain. Mark Hosenball, “FBI Examining Fake Documents Targeting Clinton Campaign: Sources,” Internet. For examples of Soviet forgeries involving the 1976 American election, see Christopher Andrew and Vasili Mitrokhin, The Sword and the Shield: The Mitrokhin Archive and the Secret History of the KGB (New York: Basic Books, 1999), 240–241.

16. Kragh and Åsberg, “Russia’s Strategy for Influence,” 35, 42–44.

17. Neil MacFarquhar, “A Powerful Russian Weapon: The Spread of False Stories,” New York Times, August 28, 2016.

18. For two examples of how such content can be generated, albeit in cases not involving Russia, see Terrence McCoy, “For the ‘New Yellow Journalists,’ Opportunity Comes in Clicks and Bucks,” Internet; and Craig Silverman and Lawrence Alexander, “How Teens In The Balkans Are Duping Trump Supporters With Fake News,” Internet.

19. Christopher Paul and Miriam Matthews, “The Russian ‘Firehose of Falsehood’ Propaganda Model: Why It Might Work and Options to Counter It,” Perspective, PE-198-OSD, RAND Corporation, 2016.

20. Paul and Matthews, “The Russian ‘Firehose of Falsehood.’”

21. “Cracking the Stealth Political Influence of Bots,” Internet; Weisburd, Berger, and Watts.

22. FireEye, APT28: At the Center of the Storm: Russia Strategically Evolves Its Cyber Operations, Special Report, 2017; Dmitri Alperovitch, “Bears in the Midst: Intrusion into the Democratic National Committee,” Internet.

23. KGB Lexicon: The Soviet Intelligence Officer’s Handbook, ed. Vasiliy Mitrokhin (Portland: Frank Cass, 2002), 111.

24. The governmental and scholarly literature in the Anglosphere on the definition and purposes of intelligence is vast and there are differences even among nations generally considered as close intelligence relatives. Mark Lowenthal’s definition is probably the most representative of the understanding of the term among American practitioners: “intelligence is the process by which specific types of information important to national security are requested, collected, analyzed, and provided to policy makers; the products of that process; the safeguarding of these processes and this information by counterintelligence activities; and the carrying out of operations.” Mark Lowenthal, Intelligence from Secrets to Policy, Seventh Edition (Thousand Oaks: CQ Press, 2016), 10.

25. Nevertheless, the United States Intelligence Community has occasionally released sensitive intelligence information to influence public debates. Examples include the release of aerial reconnaissance photographs during the Cuban Missile Crisis, the release of a small amount of signals intelligence in 1983 after the shootdown of KAL 007, and the release of intelligence during Secretary of State Colin Powell’s 2003 speech to the United Nations Security Council, which was intended to sway the Council’s vote on the Iraq crisis.

26. Robert T. Holt, Radio Free Europe (Minneapolis: University of Minnesota Press, 1958), 24–25.

27. Paul and Matthews, “The Russian ‘Firehose of Falsehood.’”

28. James van de Velde.

29. Robert J. Bunker, “The Use of Social Media Bots and Automated (AI Based) Text Generators: Key Technologies in Winning the Propaganda War Against Islamic State/Daesh?,” Internet.

30. Mark Zuckerberg, Internet.