WhatsApp is Among the Leading Platforms for Information About Politics, Rutgers Study Finds
This new research, co-led by SC&I Assistant Professor Kiran Garimella, aims to help address the spread of political dis- and misinformation on encrypted messaging apps.
These new findings “will exclusively benefit a diverse range of audiences, who are typically underserved by the mainstream media,” wrote study co-author SC&I Assistant Professor Kiran Garimella and his colleagues at the University of Texas at Austin.

A new Rutgers study examining false or misleading information posted to WhatsApp and targeting three diasporic communities in the U.S. (Cuban Americans, Indian Americans, and Mexican Americans) reveals that false and misleading information about news or current events ranked higher than most other forms of nefarious content received on the app, exceeded only by scam or phishing content.

A whitepaper presenting their findings, “Talking Politics on WhatsApp: A survey of Cuban, Indian, and Mexican American diaspora communities in the United States,” co-authored by SC&I Assistant Professor of Library and Information Science Kiran Garimella, was published by the Center for Media Engagement at the University of Texas at Austin in October 2022.

Garimella said he and his co-authors at the University of Texas at Austin chose to study these three communities because they are regular users of WhatsApp and, more critically, constitute an important part of the American electorate. As a result, they are now being heavily targeted on WhatsApp with dis- and misinformation about American elections and politics in an effort to “digitally shape public opinion.”

Providing background on the origins of scholarly interest in the spread of false information on WhatsApp, Garimella and his co-authors wrote, “Spurred on by the 2016 U.S. presidential election and Brexit, scholars have begun to devote a great deal of time and resources investigating how false information spreads, who produces it, and/or how it affects politics . . . and little has been done to generate knowledge about how false information – particularly false political content and content about critical events – circulates among diaspora communities in the United States.”

This is a critical time to undertake the research, Garimella noted, because significant American elections are approaching: a special election to the U.S. House of Representatives is scheduled for February 21, 2023, and the next U.S. presidential election is scheduled for November 5, 2024.

By conducting surveys with members of the three U.S. diasporic communities about the content they consume on WhatsApp, Garimella et al. wrote that their findings show:

  • “WhatsApp is among the leading platforms for inaccurate information about politics.
  • Some people believe in concrete conspiracy narratives, despite perceptions of low levels of false information on the app.
  • WhatsApp is crucial for personal use – but politics is discussed there, too.
  • False information is most often thought to come from strangers – but it also comes from close connections.
  • People think there is little false information out there.”

The whitepaper stems from work Garimella and his co-authors are conducting for their larger project titled “WhatsApp Monitor: A System for Fact-checking WhatsApp Content from Diaspora Communities.” Funded by the Knight Foundation, the timeline for this project is March 2022 through February 2024.

Through this project, Garimella and his colleagues at UT Austin wrote they seek to answer the questions, “How much content consumed by diaspora communities on WhatsApp is misinformation? Given the end-to-end encryption, what is the right way to fact-check on WhatsApp?”

The goals for the project, they wrote, are three-fold: “First, to develop a web-based monitoring system tailored towards the needs of Latinx and South Asian diaspora communities in the U.S., specifically, Florida, Texas, and North Carolina; second, and with this tool, to help fact-check content from these communities on WhatsApp; and third, to understand the dis- and misinformation ecosystem on WhatsApp more broadly which will deliver lessons that can be applied for developing efficient fact-checking solutions for all content on encrypted messaging apps.”

Garimella at Rutgers and the UT Austin team led by faculty member Samuel Woolley will combine qualitative and quantitative research methods. Garimella’s colleagues at UT Austin will lead the qualitative work, conducting interviews and organizing focus groups with members of the three diaspora communities they are studying.

Garimella will lead the quantitative work by using large-scale WhatsApp data to track disinformation and to develop a monitoring tool, a web-based dashboard, which will be similar to one he developed previously that was used by over 100 journalists, newsrooms, and fact-checking agencies during national elections in India and Brazil.

This new dashboard, Garimella said, will “aggregate content from multiple groups to identify the most popular (‘viral’) images, videos, audio, texts and URLs shared in the WhatsApp groups we are monitoring.”
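As an illustration only, the kind of aggregation such a dashboard might perform could resemble the sketch below. The message fields, content hashing, and ranking rule here are assumptions made for the example, not details taken from the whitepaper or from Garimella’s tool.

    # A minimal sketch of cross-group "virality" ranking.
    # Message fields and the ranking rule are illustrative assumptions.
    from collections import Counter, defaultdict
    from dataclasses import dataclass

    @dataclass
    class Message:
        group_id: str       # WhatsApp group the message was observed in
        content_hash: str   # hash of the image/video/audio/text/URL payload
        media_type: str     # "image", "video", "audio", "text", or "url"

    def rank_viral_content(messages, top_k=10):
        """Rank shared content by how many distinct groups it appeared in,
        breaking ties by total share count."""
        share_counts = Counter()
        groups_seen = defaultdict(set)
        media_type = {}
        for m in messages:
            share_counts[m.content_hash] += 1
            groups_seen[m.content_hash].add(m.group_id)
            media_type[m.content_hash] = m.media_type
        ranked = sorted(
            share_counts,
            key=lambda h: (len(groups_seen[h]), share_counts[h]),
            reverse=True,
        )
        return [
            (h, media_type[h], len(groups_seen[h]), share_counts[h])
            for h in ranked[:top_k]
        ]

    # Example: the same image observed in two groups outranks a single-group URL
    sample = [
        Message("group-a", "img-123", "image"),
        Message("group-b", "img-123", "image"),
        Message("group-a", "url-456", "url"),
    ]
    print(rank_viral_content(sample))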

To ensure the data they collect makes a practical, applied, and global impact, they will provide communities and sources they trust (e.g., journalists and fact-checkers) with access to the dashboard. Garimella and his co-authors explained, “These sources will help fact-check the viral content posted in these groups. Once a piece of content is fact-checked by any of the stakeholders, using automated push mechanisms, they will publish the result of the fact-check back to the WhatsApp group where the misinformation was originally received, to notify the WhatsApp users that they have been exposed to dis- and misinformation.”
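In outline, the “push back” step described above could work like the sketch below. The record structure and the notify callback are hypothetical stand-ins; the whitepaper does not specify how the automated push mechanism is implemented.

    # A minimal sketch of returning fact-check results to originating groups.
    # FactCheck fields and notify() are hypothetical stand-ins, not the project's API.
    from dataclasses import dataclass

    @dataclass
    class FactCheck:
        content_hash: str   # identifies the viral item that was checked
        verdict: str        # e.g. "false", "misleading", "true"
        source_url: str     # link to the published fact-check

    def push_fact_check(fact_check, origin_groups, notify):
        """Send the fact-check result back to every group where the
        content was originally observed, using a caller-supplied notifier."""
        message = (
            f"Fact-check result for shared content: {fact_check.verdict}. "
            f"Details: {fact_check.source_url}"
        )
        for group_id in origin_groups.get(fact_check.content_hash, set()):
            notify(group_id, message)

    # Example with a stand-in notifier that just prints
    origin_groups = {"img-123": {"group-a", "group-b"}}
    push_fact_check(
        FactCheck("img-123", "false", "https://example.org/check/img-123"),
        origin_groups,
        notify=lambda group, msg: print(f"[{group}] {msg}"),
    )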

Further, they explained in their proposal to the Knight Foundation, they will then analyze all the data they collect and conduct follow-up interviews with the three communities to identify best practices and develop solutions to tackle misinformation.

Their aim, ultimately, they said, is to assist three major stakeholders: academics; diaspora communities and NGOs representing these communities; and journalists/fact-checking agencies.

Academics will benefit from the data they collect on “consumption habits from minority communities on under-studied platforms like WhatsApp.”

Diaspora communities will benefit, they wrote, because the “project provides a voice to the communities and helps expose coordinated networks that target and spread disinformation in these communities. It also incorporates a design justice-based approach by actively involving these communities in deliberative processes for dealing with false information.”

Lastly, journalists and fact-checkers will benefit, they said, because they “will have an improved reach and an understanding of this typically understudied population. They will also gain new tools that will enable scalable fact-checking and journalism that can directly reach marginal populations.”

Discover more about the Library and Information Science Department at the School of Communication and Information on the website.
