
 

Ukraine 5

David R. Kotok
Mon Aug 22, 2022

The fifth program in the Ukraine: What’s Next? series was held at USF Sarasota-Manatee on Wednesday, June 8, 2022. Part 5, focused on disinformation, is titled “Hearts and Minds.”
 
The program series is a collaboration among the USF Sarasota-Manatee campus, the Air Force Association Florida West Coast Chapter, Cyber Florida, the Global Interdependence Center, Security Management International, the USF Institute for Public Policy and Leadership, and USF ResearchOne.
 
The first four parts of the series are available here:

Part 1, “Military and Intelligence Insights,”
https://www.cumber.com/market-commentary/ukraine-whats-next-part-1

Part 2, “Financial and Economic Impacts,” https://www.cumber.com/market-commentary/ukraine-whats-next-part-2

Part 3, “Cyber Security Analysis,” https://www.cumber.com/market-commentary/ukraine-whats-next-part-3

Part 4, “Diplomatic and Humanitarian Aspects,” https://www.cumber.com/market-commentary/ukraine-whats-next-part-4
 
Here’s the link to the video of the Part 5 program:

https://www.youtube.com/watch?v=gz-sv99796I
 
We heartily encourage readers to invest the time to watch this three-hour program on disinformation, its enormously corrosive power, and what we might do about it. Because disinformation is such a powerful and destructive tool in play in the Ukraine war and beyond, we have gone the extra mile in this commentary to relate key information and the arguments of the experts who appeared in the program.
 
After introductory remarks, Karen Holbrook, Regional Chancellor of the USF Sarasota-Manatee Campus, explains that in the course of the first four programs in the series, participants often noticed how significantly misinformation, disinformation, and propaganda affected the war in Ukraine and the world’s perception of it; so the program organizers decided to add a fifth program to focus on this crucial issue.
 
General Scott Gray, conference moderator, then summarizes the series to date, starting at 00:04:30.
 
Keynote Interview with Sue Gordon
 
At 00:14:45, Ron Sanders, Staff Director at Cyber Florida, briefly reminds us that there is a war being waged on the internet, and on social media in particular, for hearts and minds; and he then introduces Sue Gordon, Director, CACI International; former Principal Deputy Director of National Intelligence; former Director of the CIA Information Operations Center; and a person who has “spent a lot of time in the Oval Office.” Ron’s interview with Sue commences at 00:18:53.
 
Sue tells us that the “active measures” along propaganda and disinformation lines that Russia is employing in its attack on Ukraine have been part and parcel of Russian and Soviet warfighting since tsarist times. Thus, through the internet and other media, Russia was preparing the Ukrainian battlefield long before its troops entered the country. Then, since the invasion, we have seen the integration of information operations with “kinetic warfighting,” both to promote and inspire Russian actions and to counter actions and confuse Russia’s opponents.
 
Sue notes, though, that Russia’s information operations have been less effective than Russia may have expected, in part because of Ukraine’s experience with such operations and in part because of the help Ukraine has received from its allies, and especially from the US intelligence and technology sectors. She cites one particularly effective move in which the US government revealed previously classified information about what the Russians were planning.
 
Nevertheless, Sue allows, Russia’s disinformation efforts in Russia itself and in Ukraine, throughout Eastern Europe, around the globe, and here in the US, too, “have been incredibly effective for them.”
 
In response to a question from Ron, Sue explains that such actions by the Russians are not necessarily a prelude to war but are instrumental in undermining their adversaries’ power.
 
Information operations are also closely tied to economic activity, Sue says. For example, there was extensive Russian involvement in Brexit.
 
Sue goes on to make the point that the foundation of democracies is truth: It’s “being able to believe in your institutions, in the information you’re getting”; so that, if the Russians create chaos through their propaganda operations, whether in Ukraine or right here in the US, “the effect of that is to undermine confidence and, actually, to contaminate the host [country].”
 
But what if a democracy is not so much a fixed set of institutions and principles laid down in founding documents and laws as it is a dynamic interplay of voices, in the present political moment, as each seeks to convince the public and establish as dominant its version of the truth? Then, our cherished democratic principles, first among them the freedom of speech, may actually leave the door open to, or even be a prerequisite for, the emergence of demagoguery or fascism. That is the argument laid down by Zac Gershberg and Sean Illing, coauthors of the new book The Paradox of Democracy: Free Speech, Open Media, and Perilous Persuasion (Univ. of Chicago Press). We will leave it to readers to investigate their thesis and form their own opinions on this crucial topic.
 
So the Russians are out there, in Ukraine, in their other former satellite states, in Western Europe, here in the US, and throughout our intensely interconnected world, doing their utmost to convince everyone of their perverse “truths”: for instance, the notion that the Americans and Europeans are responsible for provoking the war in Ukraine and are therefore to blame for the social and economic disruptions the war has brought upon us.
 
And, Sue notes, the Russian disinformation campaign is powerful enough that, although the entire NATO alliance has stood up against it, “You’re starting to see that … Germany and France are talking a little bit different than the rest of NATO” as their own economic vulnerabilities are exposed.
 
“I think this is something we have to keep talking about,” Sue maintains, “because one of the real problems with this is that, as pervasive as disinformation in general and active measures and Russian involvement [are] … we so don’t talk about it that the average citizen doesn’t necessarily know that the information they are being fed is designed to shape their views….”
 
“How much of this is enabled by technology, or does technology actually hurt?” Ron asks.
 
“Well, we aren’t going to put that genie back in the bottle,” Sue rejoins; “we can’t long for a simpler time…. Technology gives you volumetric effects that you can’t get physically; you just get blooming of information you couldn’t get with leaflet drops. Two, you get a measure of stealth, because the ability … to distinguish true from not true is really hard for the average bear, especially in democracies, where you don’t want to control content….”
 
They are speaking to a key point made by Gershberg and Illing. They say, “When new forms of communications arrive, they often bolster the practices of democratic politics. But the more accessible the media of a society, the more susceptible that society is to demagoguery, distraction, and spectacle” (https://press.uchicago.edu/ucp/books/book/chicago/P/bo146792768.html). And in the present age, when virtually everyone who owns a phone or computer has the potential to influence the entire global body politic (and can expect to be constantly impacted by other influencers), democracy may now be both more empowered and imperiled than it has ever been.
 
The tools that are now available to power players in the information wars are a quantum leap beyond what they had even a decade ago. As Sue explains, “Big data, in the hands of people intending to use information for their advantage and to shape dishonestly, can be used to actually find the fissures in society that can be exploited…. Now, with the analytics and the reach, you can target individuals….” And then you can turn your bots loose to really do some damage.
 
So these are sophisticated, politically well-coordinated, highly effective propaganda techniques and technologies. “Very problematic,” Sue admits.
 
And of course, the pandemic was, in Sue’s words, “a very interesting microcosm of disinformation around vaccines, around the spread of disease, around who was to blame.” Ron chimes in to mention the results of a study he and a couple of colleagues completed last year on Covid-19 and social media: “It turns out that the vast majority of the population doesn’t ask or listen to experts. They listen to friends, including ‘friends’ they don’t even know, because they share their point of view or what they think is their point of view.”
 
Sue rejoins with, “Yeah, it’s the whole notion of generating and exploiting conspiracy theories, and again, in a really targeted way. What I mentioned before about compromising the host; now you have this post-truth world where people have come to believe that what anyone says is as good as what someone else says; and, by the way, I may believe my neighbor more than my government, which, again, is an effect that they’re trying to generate.”
 
Thus when the US government issues a policy related, let’s say, to Covid, the intense public blowback that is instantly generated via social and mass media may be powerfully fomented by (if it is not originated by) a few ne'er-do-well characters sitting in an apartment in a suburb of Moscow.
 
As a result, we now have institutions that are less trusted. But Sue sees a hopeful, “glass-half-full” sign in the fact that the US government has been willing, in the past five years or so, to release information that exposes the activities of the Russians and others. A prime example is the decision to put out, in an unclassified fashion, the intelligence community’s assessment of the effort by Russia to influence the 2016 election. And, as Sue points out, in doing so our government ran the risk of doing the work of our adversary, by telling people that they might not be able to believe in their own elections. Nevertheless, Sue thinks this trend to counter disinformation efforts with transparency is “a really strong move.”
 
But, Sue admits, “I think our institutions need to work harder to either fix themselves so they can be trusted, or represent the fundamental trustworthiness of the institutions, because … you have a citizenry that is so ready to believe anything, and you have that being shaped by adversaries who would do us harm…. In America, the biggest threat that we face is that we won’t believe in ourselves.”
 
Ron and Sue turn to talking about solutions to the disinformation crisis. We don’t need a “single, state-sponsored message,” says Sue; that’s antithetical to our way of life. We need a multipronged approach. We have to get both the private sector and the public deeply involved. But there may need to be at least some limited regulation of the social media platforms.
 
To wrap up the session, Ron asks Sue, “You’re in the Oval Office. You’re knee to knee with the president, and she says, ‘All right, Sue Gordon, what do I do? What does the government do?’”
 
Sue responds, “I’m always about the first act I would do. I think I’d use my platform to talk to the American people, but not in response to a particular event, where people are unlikely to believe any specific approach because there’ll be so many people on different sides. But I’d talk about this issue. I’d talk about the issue as one that is most fundamental to our future. I’d talk about what’s happening…. You could present such an impressive, integrated study of how this technique has been used historically and how it’s being used today. And I’d say, I’m going to make … countering this activity the most fundamental thing I do….”
 
Panel Discussion
 
Ron then introduces, at 1:12:04, a panel discussion in which he asks the three participants many of the same questions he has just asked Sue. The three panelists are J.D. Maddox, Adjunct Professor at the George Mason Schar School, former Army Psy-Ops Operator, CIA Branch Chief, and Deputy Director of the US Global Engagement Center; Golfo Alexopoulos, Professor and Director of the USF Institute on Russia; and James Foster, Chairman and CEO of ZeroFOX, a company that protects against cyberattacks, including social media attacks. (However, Foster has an audio problem on Zoom and ends up not participating in the panel.)
 
Ron asks Golfo to lead off by telling us what is going on, on the ground, with Russian disinformation active measures, in Ukraine, Russia, Eastern Europe, Western Europe, and the US.
 
Golfo explains that the Russians have several narratives that they are pushing out, and she details them. Predictably, the narratives have been effective in Russia but not at all in Ukraine, to Putin’s surprise. Golfo echoes Sue Gordon in noting that the US was effective in releasing intelligence that countered Russian accusations of false flag operations by the West, lies that might be used by the Russians as a pretext to launch their war.
 
Next, Ron asks J.D. whether the fight for hearts and minds in Eastern Europe, Western Europe, and the US is a war of attrition. J.D. zeroes in on the effectiveness of the Ukrainian narrative concerning Russian atrocities and potential use of nuclear weapons.
 
Ron asks J.D. what he thinks of the new US strategy, described by Sue, of sharing “stuff we’ve never shared before.” J.D. points out that the US effort to call out the Russians on their impending attack of Ukraine and thereby dissuade them from attacking “did not work,” and that it’s “extraordinarily difficult to reverse a military force that is determined to conduct an attack.” However, the US effort did succeed in unifying the West against Russia, which has been critical for Ukraine’s cause.
 
Both J.D. and Golfo emphasize the significance of Ukraine’s own information operations in maintaining strong support in the West.
 
Ron observes that the US and its citizens are being buffeted by information operation campaigns. He asks, what should be the role of the US government in the information space? Should it be “benign and enabling or more proactive and aggressive”?
 
J.D. encourages more support for our journalistic institutions, in order to “express the US narrative or an unbiased narrative.” But also, he asks, how do you structure the US effort to go after the disinformation problem? He notes wryly that the US Disinformation Governance Board (DGB) “fragmented pretty quickly” (in less than one month, actually, in April–May 2022), in large part because “the American people really don’t like to be told what to think”  a central issue with respect to any attempt at governance in the information space.
 
Ironically, The Washington Post described the board as having fallen victim to “a textbook disinformation campaign” about its mission, to which it failed to adequately respond. J.D. observes that the US Dept. of Education system does not include media literacy or digital literacy in its curricula requirements.
 
J.D. mentions a number of US agencies that are working hard to combat disinformation, but he says that those efforts are really very limited, and he maintains that what we need is “a whole-society capability that brings in all of the functions that the United States has to offer.” This is not a narrative-based attack on the problem, where we merely counter false narratives that are thrown at us; rather, it utilizes not only the systems that are in place in the government but also the resources of civil society, including, says J.D., the talents of Hollywood, to “really push the American image in a way that we want to push it.”
 
J.D. quickly adds the caveat, however, that “All this comes with the very difficult question of what is truth and what is our position that we are trying to defend or our image that we are trying to create in the world?” And, he admits, “It’s very, very difficult for us, as a politically organized nation, to come up with that unifying message or that central message of what the United States stands for, especially in this very chaotic political environment that we have right now.”
 
Ron jumps in to say that he is going to push back a little bit and also put Golfo on the spot. He agrees that combating disinformation requires a whole-nation effort, but he asks, who coordinates this massive effort? He then turns to Golfo and asks, “Is there a role for the US government here, or is that role to simply be as transparent as we can about stepping back and not telling people how to think?”
 
Golfo thinks the government should play a role in interacting with social media companies to control the proliferation of bots and other disinformation-spreading tools. But she agrees with Ron that a whole-society effort is urgently required to create a resilient and well-informed citizenry.
 
So, Ron says, it’s clear that in the case of Ukraine we’re in a protracted war of attrition with Russia. He asks J.D., how do we handle that? J.D. thinks we’re a bit delusional about Putin. He asserts that we need to have a “very kinetic response” to Putin’s nuclear threats and atrocities being committed in Ukraine. Golfo backs J.D. up on his concern, saying that what we have is not a war between Russia and Ukraine, it’s a war between Russia and the West; and from the Kremlin’s point of view, it’s an existential crisis and the West is to blame for it.
 
Professor Piotr Lubinsky
 
Ron next introduces Professor Piotr Lubinsky of the Krakow University Strategic Studies Institute, who has worked with NATO and the US Special Operations Command to combat disinformation coming out of Russia. Prof. Lubinsky shares “Russian Disinformation in Poland: Anti-Ukrainian Narratives,” beginning at 2:14:40.
 
In thanking Prof. Lubinsky for his presentation, Ron says, “Talk about a dose of reality! Some of the examples you gave really drive home the seriousness of the matter. Disinformation is serious stuff, and it’s part of even more serious stuff, including armed conflict…. Disinformation is simply an alternative to armed conflict. If a nation can accomplish its objectives through mis- and disinformation … and obviate the need for troops on the ground, it will. And if it can’t, then it becomes part of armed conflict. And frankly, that’s a lesson we need to learn in the West.”
 
The CyberPatriot Programs
 
At 2:37:08, to round out the program, Ron introduces Col. Stuart Pettiss, a retired Air Force officer who is now Director of STEM Education Programs for the Air and Space Force Association (AFA), and Rachel Zimmermann, Senior Director of Business Operations for CyberPatriot. They introduce us to the CyberPatriot Programs, including the National Youth Cyber Defense Competition, which aims to help interested high school students become the next generation of cybersecurity experts and STEM professionals. Last year the yearlong competition included more than 5,000 teams from all 50 states. The programs also include AFA CyberCamps, the Elementary School Cyber Education Initiative, the Cyber Education Literature Series, the Tech Caregivers Program, and the CyberGenerations Program for senior citizens.
 
To sum up, “Hearts and Minds,” Part 5 of the Ukraine: What’s Next? series, is an extremely informative and thought-provoking discussion of utmost importance to the shape our future will take, not only in Ukraine but here in the US and around the world. For a still deeper dive, please watch:
https://www.youtube.com/watch?v=gz-sv99796I

 


 

 

David R. Kotok
Chairman & Chief Investment Officer


Links to other websites or electronic media controlled or offered by Third-Parties (non-affiliates of Cumberland Advisors) are provided only as a reference and courtesy to our users. Cumberland Advisors has no control over such websites, does not recommend or endorse any opinions, ideas, products, information, or content of such sites, and makes no warranties as to the accuracy, completeness, reliability or suitability of their content. Cumberland Advisors hereby disclaims liability for any information, materials, products or services posted or offered at any of the Third-Party websites. The Third-Party may have a privacy and/or security policy different from that of Cumberland Advisors. Therefore, please refer to the specific privacy and security policies of the Third-Party when accessing their websites.


 
