During this year’s election season in Mexico, propagandists leveraged a new mass-broadcasting feature on WhatsApp, called “channels,” to impersonate reputable political news outlets and pump out misleading information. Thousands of miles away in India, the dominant Bharatiya Janata Party (BJP), which has built a massive online communications machine dubbed the “BJP Digital Army,” waged a ruthless propaganda campaign on WhatsApp ahead of the summer elections. And in the United States, Elon Musk published an AI-fabricated image of Kamala Harris donning a communist uniform on X, which gained wide traction among WhatsApp groups of Hispanic Americans in South Florida – a prominent constituency of swing voters.

Messaging platforms such as WhatsApp, Telegram, and Viber often advertise themselves as spaces for private discussion among family, friends, and other acquaintances. But the platforms have also become highly influential tools for manipulating and misleading voters around the world. 

Our new study, a joint project of New York University’s Stern Center for Business and Human Rights and the University of Texas at Austin’s Center for Media Engagement, shows that the spread of manipulative political content on these apps is intentional, professionalized, and systematic. While billions of people have already cast their ballots this year, it is not too late for messaging apps to take needed action to protect their users from manipulative propaganda.

Weaponization of Messaging Apps

Deceptive or otherwise malicious information that “trends” on social media often originates from content shared on messaging apps. The reverse is also true: Manipulative narratives that originate on social media sometimes take on a new, and extended, life within the messaging app ecosystem of encrypted groups and private chats. 

 Some recent examples in the United States illustrate the trend:

  • Earlier this year, in the wake of protests at university campuses, the messaging app Telegram provided a breeding ground for false conspiracy theories about U.S. President Joe Biden’s alleged links to the terrorist group Hamas. 
  • Fake news articles alleging attacks on Indian students at Texas universities have spread on WhatsApp in a targeted attempt to turn the Indian community against local Democrats, who are portrayed as soft on crime. 

The weaponization of messaging apps by political propagandists is not just a problem in the United States. In fact, other countries have experienced this trend earlier and more intensely. Recent elections in the European Union and Asia demonstrate the extent to which messaging apps have become central to electoral campaigns in those regions:

  • In Hungary, before the 2024 European Union elections, propagandists weaponized Telegram’s forwarding bot, a turbocharged computational propaganda tool, against LGBTQ+ and pro-democracy civil society organizations, portraying them as “Western-controlled.” 
  • During Turkey’s 2024 municipal elections, propagandists set up fake “news” channels on Telegram and WhatsApp imitating the Turkish Radio and Television Corporation (TRT) and other reputable outlets. They used the channels to share deceptive videos portraying opposition politicians as sympathetic to both Syrian President Bashar al-Assad and jihadis in Syria and urged viewers to amplify the content by “reposting” it to their stories or status updates.
  • In the Philippines, the government of former President Rodrigo Duterte established channels and communities on Viber, which it promoted aggressively on social media and through physical posters. Having amassed an audience of more than two million, the government then pumped out pro-Duterte content using Viber’s multimedia creation and dissemination features, including sticker packs and bots. 

Political propagandists the world over have developed and refined a digital “broadcasting toolkit,” a set of tactics for reaching large swaths of voters directly on their phones with narratives tailored to their specific interests and views. These tactics rely on many popular messaging platforms’ growing suite of features and communication tools, which propagandists leverage in sophisticated ways to spread content virally among their target audiences.

Propagandists supporting the BJP in India are masters of such tactics. They have spent years curating systems of distribution on WhatsApp – networks of groups at the local, regional, and national levels, carefully calculated to maximize engagement and the speed of content dissemination across the app. The party then uses its cadre of volunteers, operating under the guise of unofficial, third-party accounts, to disseminate inflammatory messages that exploit specific grievances in the population. Much of the material falsely portrays opposition candidates as being in cahoots with foreign agents, ranging from the philanthropist George Soros to the CIA to Muslims generally. In the Mexican example discussed above, propagandists used a fake channel, which they managed to get verified by WhatsApp (lending it legitimacy), in combination with the app’s forwarding tool to swarm group chats with the channel’s content. From there, the “view channel” tag atop the forwarded content drove additional followers to the fake channel.

These unscrupulous tactics are often successful. According to a survey commissioned as part of our study, 62 percent of messaging app users across nine countries – spanning Asia, Africa, Europe, and the Americas – received political content on those apps, and over half (55 percent) of that content came from people or accounts the users did not know and had not chosen to follow. Further, of those who received political content from strangers, 52 percent said the content had significantly or somewhat influenced their opinions.

Platform Design Enables Propaganda 

Political campaigns waged on messaging platforms tend to fly under the radar because the apps are more private by design. Many messaging apps are end-to-end encrypted to some degree, making it virtually impossible for outside researchers – and even the platforms themselves – to systematically track content circulating in those spaces. End-to-end encryption, a cryptographic method that renders messages indecipherable to everyone except the senders and intended recipients, is particularly valuable for pro-democracy activists and dissidents in repressive contexts. In Venezuela’s recent election, for example, opposition activists used encrypted chats on WhatsApp and Signal to coordinate protests in response to evidence of widespread electoral fraud by President Nicolas Maduro and his regime. But encryption also affords obscurity to the activities of political propagandists and others, including terrorist organizations.
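
For readers who want to see the concept in concrete terms, the snippet below is a minimal, illustrative sketch of end-to-end encryption in principle, written with the open-source PyNaCl library; the keys and message are invented for the example, and real messaging apps such as WhatsApp and Signal implement the far more elaborate Signal Protocol, which layers forward secrecy and group messaging on top of this basic idea.

```python
# Minimal sketch of end-to-end encryption (illustrative only; not the
# Signal Protocol actually used by WhatsApp and Signal).
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave their devices.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"Meet at the plaza at 6 pm.")

# Anyone relaying the message (including the platform) sees only ciphertext.
# Only the recipient, holding the matching private key, can decrypt it.
receiving_box = Box(recipient_key, sender_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
print(plaintext.decode())  # "Meet at the plaza at 6 pm."
```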

Some messaging services – notably Signal – have more clearly chosen to privilege user privacy over abuse mitigation efforts, while remaining pure messaging apps for one-to-one or small group conversations. But other platforms, such as WhatsApp and Telegram, which began as basic messaging services with a simple user interface enabling one-to-one text messaging, have mushroomed into “everything-apps,” or one-stop-shops for digital communications.

The changing nature of these apps can be attributed to the platforms’ business models. Apps like WhatsApp, Telegram, and Viber draw revenue not from private messaging, which tends to be offered as a free service, but from premium features, including business accounts and ads. These same features provide the mechanisms by which propagandists reach constituents en masse while profiting from the relative secrecy of the private messaging ecosystem.

The consequences for democracy go beyond the integrity of the information environment surrounding elections. Because of their encryption and minimal moderation, messaging apps often serve as relative havens for political extremists, such as the far-right “Terrorgram Collective,” which seeks to precipitate the collapse of “society as we know it” through violent action. In the United States, such “accelerationist” extremists have used Telegram to promote and coordinate attacks against power grids and other essential infrastructure, raising concerns that a wave of planned blackouts could disrupt polling places or hinder the certification of election results. Telegram’s hands-off attitude toward such activity led to the August arrest of its founder and CEO, Pavel Durov, by French authorities on the grounds that he violated French criminal law by facilitating child sexual exploitation and fraud, among other alleged crimes.

Platforms Can Contribute to the Solution

There are concrete steps that platforms can take to safeguard elections from propaganda and manipulation without compromising the ability of their users, including human rights activists, to communicate securely through encrypted chats.

First, platforms should separate their messaging services from their social media and broadcasting services, so that only the messaging features are afforded the privacy and security of end-to-end encryption. Providing nominal privacy to group chats with hundreds of thousands of members – as Telegram does – facilitates large-scale abuse and does not make real-world sense. After all, outside of cyberspace, is there any place where 200,000 people can gather without notice? Second, messaging platforms should establish strict limits on the number and pace of account creation, so that propagandists cannot easily operate fake accounts and “phone farms” aimed at artificially amplifying their efforts. Signal, for example, has imposed a limit of one account per device, which is ideal from an abuse mitigation standpoint. WhatsApp and Viber, by contrast, allow the creation of up to two accounts per phone number, but propagandists say they can circumvent those limitations.

Third, in the context of elections, platforms should restrict large-scale broadcasting to verified channels and vetted business accounts. Where a platform enables channels and business accounts to reach large audiences, it should rigorously vet those accounts to ensure their authenticity and compliance with platform policies. Fourth, platforms should empower users with easily accessible in-app fact-checking tools, such as accredited misinformation “tiplines” and tools for reverse image searches, which allow users to retrieve information about an image, including its provenance and history of use.

As these apps become increasingly popular, both in the United States and abroad, political manipulation efforts will intensify. Messaging apps should balance their desire to monetize their platforms against the very real security challenges created when “broadcast” mechanisms are shoehorned into previously private communication spaces. By making nuanced changes to these and other platform features, they can protect both their user bases and election integrity.

IMAGE: PARIS, FRANCE – The logos of social media applications, WeChat, Twitter, MeWe, Telegram, Signal, Instagram, Facebook, Messenger and WhatsApp are displayed on the screen of an iPhone.  (Photo illustration by Chesnot via Getty Images)