If the information that we receive is going to be “curated” by the federal government, we should pay attention to the views of the curator. With that in mind, I expended ten bucks for the Kindle version of Nina Jankowicz’s book How to Lose the Information War. It’s easy to see why Biden minions wanted its author on their information control team. For the past several years, she has engaged in research on Russian disinformation campaigns. Her book includes chapters on the responses of the United States, Estonia, Georgia, Poland, Ukraine, and the Czech Republic, responses that she judges to have been sadly inadequate. So she brings a plausible aura of expertise to her new post as executive director of the Department of Homeland Security’s Disinformation Governance Board (a name that could have been coined only by some bureaucrat who privately loathes it).
She has a second qualification that comes through as one browses through the book. She is a one-dimensional political partisan: There is a “near-complete lack of evidence” for “alleged anti-conservative bias on the [social media] platforms.” [p. 130] It is a “falsehood” to say “that the Biden family engaged in corrupt activity in Ukraine”. [p. 149] In what is clearly a sub silentio allusion to the New York Post’s coverage of Hunter Biden’s “laptop from hell”, she commends Twitter for “no longer allow[ing] the distribution of hacked materials” (the ex post facto excuse for blocking the Post reporting). [p. 130] The Trump Administration enacted a “travel ban on people from predominantly Muslim countries”. [p. 22] The Mueller Report is mentioned but not its most dramatic disinformation-related finding, that the “Steele Dossier”, the bludgeon that Democrats wielded for three years against Donald Trump, was a work of fiction. [pp. 23 and 31] And so on.
So what prescription does Tsarina Nina have for disinformation? After six chapters of case studies, Chapter 7 deals with remedies. She quotes testimony that she delivered before a Senate committee in 2018 regarding what will not suffice:
Even if the United States Government were to acknowledge the threat posed by Russian influence campaigns today in no uncertain terms, and we were to walk out of the hearing room and secure beyond a shadow of a doubt the country’s election infrastructure; even if we hermetically sealed our information environment from inauthentic users and false or misleading information, and if social media companies finally put forth a good faith effort to put users and the security of our democracy first; even then, we would still not successfully dispel the threat our democracy faces from malign actors’ political influence operations. [p. 137]
She goes on to say –
These short-term solutions do not address the underlying societal fissures that left us vulnerable to information operations in the first place. As Estonia, Georgia, Poland, Ukraine, and the Czech Republic have learned, it’s past time to begin a generational investment to build resilience within populations and target the root causes of our weaknesses. [p. 139]
She has already observed that –
many of the countries I’ve profiled here still believe that good communications or the establishment of a compelling narrative is the key to winning the information war. They’re not alone. This sentiment is popular in the West as well. It is misguided, but persistent. [p. 134]
The “generational investment” is not, then, to go toward expounding truth and countering falsehood. The Tsarina and I almost agree here. Government investment in “narratives” generally has a low ROI, particularly where citizens are left free to dispute the narrative. That is as far as our agreement goes.
The first, near-term element of the Tsarina’s program is –
to set the guardrails of the internet and the social media platforms on which we share so much of our lives.
That begins with setting clear and unified definitions of not only the rules of the road, but the vehicles on them. . . . Imagine how difficult driving would become if neighboring states or countries defined simple concepts like right of way or speed limits differently. This is essentially how social media platforms operate today. Each has its own definition of disinformation, of hate speech, and of targeted abuse, among other concepts, and each deals with each of those concepts differently. [p. 139]
While conceding that “we are hesitant to allow government bodies jurisdiction over our right to free speech”, she justifies the intrusion by the fact that “social media platforms are highly curated, highly addictive content farms that have total control over what we see and how we communicate; it’s time they do it in a way that promotes the greater democratic good and not just their bottom line”. [pp. 139-40] The same phenomenon – “control over what we see and how we communicate” – is often cited by right-wing critics of social media. The difference is that the latter argue, rightly or wrongly, for limiting the ability of the platforms to block content that they don’t like. The Tsarina wants to expand social media censorship and give the government control over how the censors operate.
Phase two of the Tsarina’s plan is a great expansion of government-funded media.
I hope to one day see an America that invests more in journalism as a public good, not only through the variety of philanthropic initiatives to support the industry that have proliferated since 2016, but through further government investment in public broadcasters. In 2018, the US government allocated $447 million to the Corporation for Public Broadcasting, which funds the Public Broadcasting Station and National Public Radio, as well as their local affiliates. In comparison, the British Broadcasting Corporation’s budget for the same year was over $6 billion, despite the fact that Britain’s GDP is about a fifth of the size of the United States’. [p. 141]
She laments that the BBC didn’t save Britain from Brexit but happily notes that 57 percent of the British public regards it as the country’s most trustworthy news source (at least according to the BBC survey that she cites). She continues –
Investments in journalism as a public good are critical to maintaining a healthy information environment in which disinformation can more easily be dispelled, and in which a trusted voice is readily accessible in times of chaos and turmoil. [p. 142]
Put slightly differently, the U.S. government needs a propaganda arm and has been remiss about funding it. That need is all the more acute, because –
In New Jersey [where the Tsarina grew up], no one is ever very far from several bustling metropolitan areas, and the coverage afforded by any one of those cities’ newspapers or radio and television stations is applicable and useful for daily life. But imagine a similar situation in the nation’s heartland, and you begin to understand why so many people are flocking to dubious websites that replicate their worldview; the connective tissue between the local and the national or international has been lost. [p. 141]
Still, social media “rules of the road” and tax-funded subventions to friendly newspapers and TV stations aren’t enough.
Even with greater investments in quality journalism, people will still need help navigating the unending flow of information that characterizes the modern media environment, as I found in every country I visited over the course of researching this book. In Estonia, education and outreach form the backbone of the government’s renewed efforts to build an inclusive Estonian identity that is resilient to outside influence. As Irene Käossar, the head of Estonia’s Integration Foundation, told me, the government is no longer trying to express “why” ethnic Russians should pursue citizenship or learn Estonian language; instead, the onus is on the government to figure out how to “change thinking.” [p. 142]
Estonia is a tiny country threatened by a huge, hostile neighbor that conquered it once and shows many signs of seeking to absorb it again. An unassimilated Russian minority is a threat to its very existence. I will wager a good deal, though, that Nina Jankowicz would find Estonia’s policy abhorrent if pursued by a country with which she didn’t sympathize. Note that she has taken an initiative to persuade resident aliens to “pursue citizenship or learn Estonian language” as a template for a government program to furnish “help navigating the unending flow of information that characterizes the modern media environment”. These things are not quite like one another.
The nature of the “help” that the government is to provide and how it is to be inculcated are the vaguest parts of the Tsarina’s plan. She complains that the social media platforms have made only perfunctory attempts to educate their users:
So far, efforts to raise awareness of media and digital literacy best practices on the platforms have been lackluster; in its “Tips to Spot False News,” which is available online in the “Help Center” and has run in several of the world's major newspapers (arguably not the audience that most needs assistance spotting “false news”), Facebook even writes “reliance on unnamed experts may indicate a false news story,” a tip that might cause readers to discount reporting that uses anonymous or background sources, as many whistleblowing investigations do. More worryingly, the Trump administration has used this tactic to question critical reporting from a variety of outlets. [Did I mention that the Tsarina is a Thoroughly Progressive Tsarina?]
Overall, platforms’ efforts to promote media literacy awareness have been surface-level distractions, not deep-rooted attempts to change user behavior. They should be developed in broad consultation with experts in the field and be integrated into the user interface of the platform – not buried in a help section or in a print newspaper advertisement – at the platform’s expense. Like reminders to grab an umbrella or do your civic duty, social media platforms can empower users to be more discerning consumers of information. It is not a question of ability; it’s a question of volition. [p. 145, footnote omitted; emphasis added]
What the tools “integrated into the user interface” would look like is hard to discern, but it certainly sounds like a mechanism not simply to warn users about trollish Internet tricks but to make it impossible, or nearly so, to get access to government-disapproved content. That the Tsarina favors limiting choices rather than relying on users’ judgment is suggested by the scenario with which she closes the chapter, a hypothetical happy ending six years hence.
It’s Election Day, November 2028. The latest polls are close; it will probably be another nail-biter. But unlike eight or twelve years ago, this campaign hasn’t been characterized by the rancor of the past. After the scandalous foreign hack-and-leak operations that marked the 2016 and 2020 elections, affecting first the Democratic and later the Republican Party, a remarkable thing happened. Party leaders, editors-in-chief of the country’s newspapers, major broadcasters, and even the social media platforms convened to sign a Declaration on Truth and Integrity: they would not discuss, cover, or amplify content for which a legitimate source could not be established. This bipartisan effort did not eliminate all attempts at foreign interference, but rather than boosting the bogus claims, repeating them on the airwaves, printing them on front pages, and retweeting and sharing them online, it relegated them to the fringes of the internet, where they never gained much traction. The publicity surrounding the declaration served as a primer and awareness campaign for millions of Americans who believed for years that foreign interference was a specter politicians unleashed for personal gain.
In the months after the declaration, while the new administration settled into office, Congress took stock of what the document had achieved. It engendered cooperation between politicians and the fourth estate, across social media platforms, and between all these groups and law enforcement, which was notified as suspicious posts were encountered. They worked to build on the informal structure they created, creating an independent government hub to serve as coordinator, watchdog, and educator. The new body worked with civil society to write a curriculum for schools and professional development programs, doled out educational grants, and advocated for users’ rights in the development of new technologies on social media platforms. It also acted as a neutral researcher, employing an apolitical group of experts to act as a watchdog on questions of hate speech, political bias, disinformation, and “fake news,” among others. The government allocated greater funding to American public broadcasters, encouraging a more balanced news landscape.
In 2028, the information war is still not won. There were and continue to be challenges. It is not utopia; the project of democracy will never be. Online abuse, while on a downturn, is still a problem. But people are beginning to have more inquisitive, respectful conversations, based on responsible, level reporting. They are recognizing the humanity in their fellow avatars. Americans regularly have heated debates about how to protect free speech in the age of the internet. But unlike during the height of the information war, that speech is not being manipulated to drive division and chaos from without and within.
This may only be a dream for the future, but it is one that feels possible to achieve. [pp. 146-47]
The dream of the totalitarian utopians, from Rousseau and Robespierre to Marx and Lenin: All will be peace and harmony when the wise and the virtuous shape how everyone else is permitted to think.
Further Reading: Jonathan Turley, “Biden’s ‘Mary Poppins of Disinformation’ the perfect nanny to tidy up mess of free speech?”