(Editor’s Note: This review discusses Invisible Rulers: The People Who Turn Lies Into Reality by Renée DiResta (Public Affairs, 2024, 436 pages)) 

Renée DiResta’s new book, Invisible Rulers: The People Who Turn Lies Into Reality, provides an essential guide to what she calls “the carnival of mirrors” of the information environment. Formerly the technical research manager of the Stanford Internet Observatory, a research group that studied the abuse of social media platforms, DiResta (whom I know professionally) offers a unique combination of cool-headed analysis, historical context, and passionate first-person testimony about the upside-down world of online falsehoods.

The book is not without flaws, but it makes as important a contribution to the field as any work I have read. In fact, Invisible Rulers is so clear and free of jargon that university professors might consider using it as the core text for a seminar on the 21st-century information crisis. 

A Mom Confronts Misinformation 

DiResta brings an unusual set of credentials to her task of explaining the misinformation ecosystem. She spent years as a trader on Wall Street and in the world of venture capital, acquiring the quantitative chops that distinguish her from many policy analysts. Put simply, she understands how computers and software work, and that gives her writing authority in this area. DiResta’s interest in misinformation was initially sparked by an online campaign against measles vaccines. As she recounts in the book’s opening, in 2015 DiResta was concerned about her child going to school and facing the risk of infection by a disease that had been all but eradicated in the United States decades earlier. As DiResta explains, the “networked activism” powering this pre-COVID anti-vax movement presaged much of what was to come, not only in public health debates, but also politics and civic life generally in the United States.

“I got a firsthand experience,” she writes, “of how a new system of persuasion—influencers, algorithms, and crowds—was radically transforming what we paid attention to, whom we trusted, and how we engaged with each other.” It was not a transformation for the better.

In the waning days of hazy utopian thinking about the internet and social media, DiResta was already grappling with devotion to online conspiracy theories becoming a form of personal identity, one that could transcend appeals to fact or rationality. She saw bot and troll armies assembling, found herself targeted for harassment by brigades of Twitter users who “doxed” her and her allies (revealed their personal information), and witnessed the “filter bubble” distortions generated by Facebook Groups. Platform algorithms tuned to maximize user “engagement” (for example, “likes,” “shares,” and comments) in the interest of attracting advertising dollars had the effect of amplifying sensational content. The foot soldiers indoctrinated by these cynical recommendation systems came from diverse corners, according to DiResta: anti-Obama Tea Partiers, soon to morph into MAGA-ites; Church of Scientology believers; Nation of Islam adherents; armed militia members; and QAnon autodidacts receiving what they thought were signals from their anonymous leader, code-named “Q,” who offered cryptic dispatches about “the deep state” and globalist pedophiles. 

The Rise of ‘Bespoke Realities’

The merger and synergy of conspiracy movements help explain the persistence of various “bespoke realities,” as DiResta calls them, ranging from anti-vax activism to election denialism to the racist “great replacement theory.” Spread via Facebook Groups, YouTube and TikTok videos, and both public and encrypted Telegram channels, these interrelated distortions have contributed to a broader information environment shot through with political confusion, intimidation, and violence. 

In her personal experience and subsequent research, DiResta observed a profound asymmetry of online passion, savvy, and sheer effort, as the influencers and crowds spreading conspiracy ideas out-hustled pro-vax moms, overwhelmed fact-checkers, and surprised public institutions like the Centers for Disease Control and Prevention, which saw the conspiracy supporters as “just some people online” who didn’t deserve serious attention, according to DiResta. 

The upshot: an erosion of public trust in scientific experts, government officials like election authorities, and the mainstream media. The path was thus prepared for Russian interference in the 2016 U.S. presidential election and Russia’s use of online disinformation to accompany its full-scale invasion of Ukraine; more than 200,000 preventable COVID deaths in the United States alone; Donald Trump’s disingenuous claims of election fraud in 2020 and attempt to thwart the peaceful transfer of power; and the January 6, 2021, attack on the Capitol. Now seeking to return to the White House, Trump is using his Truth Social platform, as well as Elon Musk’s X, to foreshadow a renewed assault on election integrity, as he vows retribution against his political foes. DiResta’s book illuminates not just recent history, but today’s headlines.

Influencers, Algorithms, and Crowds

She usefully structures her analysis – and ascribes responsibility for the social ills she describes – based on three categories: influencers, algorithms, and crowds. Influencers range from Trump and Musk to anti-vaxxer and erstwhile independent presidential candidate Robert F. Kennedy, Jr.; Pizzagate and pro-MAGA provocateur Jack Posobiec; Amazing Polly, a QAnon thought leader who helped instigate online hysteria about a phony child trafficking plot involving the home furnishings company Wayfair; and Matt Taibbi, a formerly left-leaning journalist who made an unlikely transformation into an ally of right-wing politicians like Representative Jim Jordan (R-OH) in a McCarthyite “investigation” of DiResta and certain other misinformation researchers they falsely accused of conspiring with social media companies and the Biden administration to silence conservatives. Motivated by ideological goals, raw thirst for attention, clever monetization schemes, or some combination of the three, influencers share a talent for manipulating engagement-sensitive social media algorithms in a manner likely to get the software systems to amplify their content. 

But critically, DiResta emphasizes, none of this would have impact without the crowds – the individuals who interact with and, above all, share influencers’ content when prompted to do so by various recommendation engines. Users have agency and exercise it. “We” – or at least subsets of “we” who get caught up in rumors, smears, and dark fantasies that appeal to preconceived biases – are part of the problem, in her view.

DiResta’s common sense reverberates throughout her book. She alludes to academic debates “about whether propaganda worked, whether rumors mattered, whether ‘fake news’ had an effect, and whether online content could truly precipitate offline action. We debated whether media or social media bore ‘more’ responsibility, as if those systems were still distinct.” 

A quick quiz to underscore this last point: If someone’s X (formerly Twitter) feed carries a Fox News segment highlighting false election-fraud claims by Donald Trump, is the source of potential influence the social media platform, Fox News, or Trump himself? Answer: all of the above. DiResta reminds readers of January 6, 2021: “Here were crowds of people screaming the online slogan ‘Stop the Steal’ and smashing Capitol building windows. It certainly suggested that what happened online did indeed matter offline—online vitriol wasn’t just performative.”

Countering Revisionism 

Facebook Groups, where QAnon and Stop the Steal took shape, were not solely responsible for the Capitol siege; many people, platforms, and media outlets played roles. But Meta’s promotion of Facebook Groups, along with online tools that made it easier for organizers to invite MAGA participants to join the action, played a vital role, too.

In 2024, the strange notion that the backlash against social media may have been misguided has bled into the tech and mainstream media, as illustrated by a dispatch from the prominent Silicon Valley newsletter writer Casey Newton entitled, “The Misinformation Panic May Be Over.” Newton points to research that he reads as undercutting the common perception that “anyone can now use a big social media app to promote falsehoods to a global audience . . . and platforms have done too little to rein in bad actors and promote high-quality information. The result is a fragmented and decaying information environment that undermines the foundations of democracy.”

It is indeed an exaggeration to say that anyone can now manipulate national or global audiences via TikTok or Instagram or Telegram. That takes skill, persistence, and an understanding of how crowds and algorithms behave. But DiResta’s book is a 436-page refutation of the glib revisionist claim that anxiety about misinformation constitutes a baseless “panic.”

Subpoenas, Lawsuits, and Harassment

The most memorable portion of Invisible Rulers comes near the end when the author recounts her personal travails in 2023 and 2024 as a target of the highly coordinated partisan attack on misinformation researchers. She offers vivid portraits of Taibbi, Jordan, and the rest of the cast of characters arrayed against her and colleagues like Kate Starbird, a computer scientist at the University of Washington and faculty director of the Center for an Informed Public. Deploying congressional hearings and subpoenas, lawsuits, and incessant online harassment, this crew wove an unsubstantiated web of accusations about government and Silicon Valley “censorship,” which, in DiResta’s telling, diverted resources, crippled research efforts, and did serious psychological harm to some of those targeted. 

These attacks continue. Stanford, saddled with steep legal defense expenses and unwanted controversy, effectively pulled the plug on the Internet Observatory where DiResta formerly worked, and did not renew her contract. As a pivotal U.S. presidential election approaches, a network of energetic misinformation researchers who sounded important alarms in 2020 has gone largely silent – an ominous epilogue to DiResta’s book.

Invisible Rulers has imperfections. The title, borrowed from the 1928 book Propaganda by Edward Bernays, a social theorist and pioneer of corporate public relations, doesn’t quite fit. Bernays wrote about public relations gurus, advertising executives, and top government aides who shaped public opinion a century ago while remaining truly anonymous. By contrast, the influencers, platforms, and digital mechanics that DiResta so adroitly describes are not invisible. They’re largely not being held accountable, but that’s a reflection of corporate irresponsibility and lack of political will, not anonymity. The book also has a long middle section that tends to circle back, restating DiResta’s themes and insights rather than rewarding the reader with a sense of forward momentum. 

But these are quibbles. Overall, Invisible Rulers succeeds in explaining why and how we live in a carnival of mirrors where it has become so difficult to achieve consensus on the most basic of facts. 

IMAGE: The logo of the social media and social networking site X (formerly known as Twitter) is displayed centrally on a smartphone screen alongside those of Threads (L) and Instagram (R) on Aug. 1, 2023, in Bath, England. (Photo by Matt Cardy via Getty Images)