The 'wild, wild west' of WhatsApp misinformation


As a wild and baseless rumor about an immigrant community in Springfield, Ohio, made its way from a Facebook post and onto a national stage, reporters, internet sleuths, and everyday social media users were able to track where the consequential false claim had originated.

But what if that insidious allegation was made in a group chat on a private messaging app like WhatsApp instead?

"It's kind of the wild, wild west," Jenny Liu, a misinformation and disinformation policy manager at Asian Americans Advancing Justice, told Scripps News. "It's just harder to monitor stuff that's happening in these closed channels."

WhatsApp, one of the most popular private messaging platforms in the U.S., is best known for its encryption. Like apps such as Telegram, Signal and Facebook Messenger, WhatsApp keeps users' messages in group chats and dedicated channels private, leaving them largely out of reach of fact-checkers.

"With those messaging apps, you kind of had this really double-edged sword," Dr. Inga Trauthig, head of research at the Propaganda Research Lab at the Center for Media Engagement at The University of Texas at Austin, told Scripps News.

"On the one hand, it's seen as like a private space." But on the other hand, Trauthig said, because these apps are encrypted and messages are private, it's nearly impossible to moderate or fact-check content the way Facebook or X does for public posts.

RELATED STORY | Microsoft tracks Russian election influence campaign shift to undermine Harris

More than 85 million people in America use WhatsApp, which is owned by Facebook's parent company, Meta, and Asian and Latino Americans are among the platform's heaviest users in the U.S. Over half of Hispanic adults, 54%, and 51% of Asian adults say they use the platform, according to Pew Research Center, compared with 31% of Black adults and 20% of white adults.

Experts and community leaders say Asian and Latino Americans, the fastest-growing groups of eligible 2024 voters, according to Pew Research, are targeted by disinformation on WhatsApp.

"What we've seen is disinformation being used intentionally, not as something that may be, 'Hey I got something wrong and then a fact check it.' No, this has been done intentionally," Domingo Garcia, president of the League of United Latin American Citizens, told Scripps News.

Disinformation targeting Latino Americans has ramped up ahead of November, Garcia said.

"In states like Florida, Texas, Colorado, Arizona, the Latino vote is the vote that could decide who becomes the next president. And if you can get three or five percent of people to change their vote or to just stay at home, that could be the difference in terms of the political power and what happens in Washington, D.C.," he added.

Garcia said false information on WhatsApp uses specific terms like 'socialism' and 'communist' to strike a chord with Latino Americans, especially those with connections to countries like Cuba or Chile.

Propagandists "calling Biden, a Marxist, a communist, and [saying] Democrats are Marxists and communists," Garcia said, "are using WhatsApp as a way to reach Latino voters because Latinos disproportionately use WhatsApp to communicate with their loved ones."

RELATED STORY | Russia, Iran, China are ramping up efforts to influence US election, intelligence says

"We have over 63.7 million Hispanic and Latinos living in the United States, where over 40 million of them speak Spanish at home, like me, like my family. And so when we think about them, you think about 36.2 million Hispanics that are eligible to vote," Evelyn Pérez-Verdía, founder of the strategic communications agency We Are Más, told Scripps News.

Her agency works to meet the "dire need of quality information" in South Florida communities, which are largely Latino.

"Mis and disinformation have always existed in different formats. Now, what happens is that we have a digital borderless format," Pérez-Verdía said.

While borderless, WhatsApp is still difficult for fact-checkers to penetrate. Community organizations have therefore turned to everyday users to help contain – or cut off – the flood of false information.

What's True Crew

Dr. Amod Sureka is a father and physician, and in the little spare time he has each day, he catches election misinformation as it enters group chats.

"I think it's going to be people like me who might be a bridge from the people who know even less than I do about politics to the very, gung ho professional politicians," Sureka told Scripps News.

Sureka, who is Indian American, is a member of the "What's True Crew," a group created by Indian American Impact to fight election misinformation in South Asian American communities.

"Certainly there are those differences of opinion, but there are also frank disinformation being put out in terms of politician's records," Sureka said.

For example, one WhatsApp message captured by the What's True Crew, and reviewed by Scripps News, appears to attack Vice President Kamala Harris' racial identity. The message includes a link to a 2018 YouTube video, urging Indian American voters to watch it before they cast their ballots in November.

"I will tell you for most of, her Kamala Harris' career, she refers to herself as African American. She rarely talks about her Indian heritage" the video says.

The message featuring the video was sent after former President and Republican presidential nominee Donald Trump falsely questioned Harris' race in July.

RELATED STORY | Truth be Told: How AI is posing a new disinformation threat this election

"She was always of Indian heritage, and she was only promoting Indian heritage. I didn't know she was Black until a number of years ago when she happened to turn Black," Trump said during an on-stage interview.

Sarah Shah with Indian American Impact, the organization behind the What's True Crew, says misinformation like this has "eroded trust in our democracy. It's pitted communities against one another."

What complicates the matter, Shah says, is that "you are in groups largely with your family and friends and like-minded individuals. And so most people within a group are in some way trusted messengers."

"The WhatsApp groups that I'm part of, they're kind of built around a given community. For that reason, I think we have this natural trust in people who are like us," Sureka says.

This trust can be exploited by bad actors hoping to influence people in his community who aren't familiar with U.S. politics, Sureka says.

"There's a lot of stuff coming to us, and it makes it really challenging for me to discern what's true or not. And certainly, for people who are less familiar with what's going on," he adds.

Anu Kosaraju, also a member of the "What's True Crew," says she is amazed by the kind of false information she sees in her WhatsApp groups.

"You think of this misinformation or disinformation as little bullets hitting the community," Kosaraju tells Scripps News.

What's scary about the encrypted platform, she adds, is "when it comes from a friend or an acquaintance or a relative, there is an immediate sense of authenticity to it.

"That's when I realized how much WhatsApp had taken hold as an actual source of information as if it was like BBC or some other kind of an official or trustworthy source," she said.

'Slippery fingers'

Half of American adults get their news from some form of social media, according to the Pew Research Center. Meanwhile, major social media platforms are being called out by fact-checking groups for their insufficient response to misinformation, particularly ahead of the 2024 election. These critiques come as content moderation tools, teams and metrics continue to disappear and disinformation campaigns grow in sophistication by the day.

"Users who we spoke to, for instance, from diaspora communities in the U.S. would say, 'Well, WhatsApp is really important for me. It's a private space. I talk about things that maybe I'm not comfortable talking about on Facebook. And I also talk about the news and about politics with my community. So, I don't want anyone from the outside in here,'" propaganda researcher Dr. Inga Kristina Trauthig said.

RELATED STORY | Female election workers face growing threats as disinformation flourishes

False information easily spreads in these groups by what Kosaraju calls "slippery fingers."

"It's one of those things that is so scarily easy with just a finger push, two clicks and then there you are, spreading the same disinformation or spreading something that may or may not be 100% true," she said.

A popular feature on WhatsApp is the ability to forward a message from one chat to another.

"On WhatsApp, you just click a forward button and you can send it," Shah said. "You can broadcast it. You can send it to groups that might have only a few people, but they might also have 250 people. And then others can forward it and there's no real way to track that picture or video, to see its original source."

Encrypted messaging apps are either late to the content moderation game or missing from it entirely. If false content is debunked, removed or labeled as misinformation on another social media platform, it still lives on in WhatsApp messages, Trauthig said.

"Often tweets would also just be put into a WhatsApp chat and then even if the tweet might not exist anymore, was discredited sometime afterwards. The WhatsApp conversation about it still continues," she says.

"I think that's where the companies can help us – by nipping it in the bud," Kosaraju says. "Don't even let that fake news through."

A spokesperson for Meta, WhatsApp's parent company, told Scripps News it has introduced new tools to help stop the spread of suspicious or false information, including labels and limits on forwarded messages, bans on mass messaging, and more support for a large network of fact-checkers.

With no federal regulation of misinformation on social media platforms, the onus to put guardrails in place, especially in an election year, falls solely on companies and their policies. But as in the case of Meta, those policies and how they're enforced can vary depending on which platform — Facebook, Instagram, Threads or WhatsApp — users are on.

WhatsApp says it may take action against accounts that violate messaging guidelines like sharing illegal content, abusing reporting tools to harm other users or engaging in fraud through "purposeful deception or impersonation."

Still, the platform largely relies on users like Kosaraju and Sureka to report false information so the company can act.

"It would be far easier to just let it go and jump over to another group – scroll past that message," Sureka says about misinformation he sees on WhatsApp. "But the first thought is we have to we have to rebut it for anyone who's paying attention to it. Just so incorrect information is not being left unchallenged."