How disinformation fuels political polarization

Ten years ago, scores of messages and videos about freedom and democracy, shared on Twitter and YouTube by activists in Egypt, helped trigger the Arab Spring. The uprising made digital and social media activism a permanent fixture of news coverage, and it remains an active subject of journalism research.

Deen Freelon is an associate professor at the University of North Carolina Hussman School of Media and Journalism and was a keynote speaker at the Computation+Journalism Symposium hosted by Northeastern University and the Brown Institute for Media Innovation in February. Freelon, whose research focuses on misinformation and political polarization, discussed the asymmetry in how scholarly research and news coverage treat the left and the right, and what that means for journalists and computational researchers trying to make sense of how each political side engages with disinformation and misinformation on digital media.

Conservatives trust mainstream news less and are more likely to share, engage with, and believe false news

Ideological asymmetry, which refers to systematic differences in the beliefs and behaviors of the political left and right, has “deep roots in American political and media life,” Freelon said. Conservatives also have a “stronger preference to form ideological echo chambers,” which are situations where people are “primarily consuming news from their side of the political aisle.” This leads them to either shut out or dismiss mainstream news. 

“Conservatives tolerate politicians spreading disinformation, where they are more okay with the spreading of false news by politicians that they favor,” Freelon said. “We saw this here in the United States throughout the Trump administration.”

Little is known about left-wing disinformation and right-wing hashtag activism

Freelon described hashtag activism, the use of hashtags "as a means of promoting various social causes" and as a means to an end in pursuing social goals, as a "more effective mode of online activism for the left."

Similarly, the right has “its own alternative media ecosystem,” Freelon said, that is “disproportionately tolerant of mis- and disinformation and closely networked with ideological leaders including Donald Trump.” 

While it is known that the left does not engage with mis- and disinformation to the same extent as the right, "that doesn't necessarily mean there is no disinformation on the left," Freelon said. Nor does it mean that the right doesn't engage in hashtag activism.

“It simply means that there’s been a lot of research on left-wing hashtag activism and right-leaning disinformation,” Freelon said. “Those alternate squares in the grid are a lot less investigated at this point.”

Disinformation dystopia, Freelon said, is a narrative seen throughout media coverage and academic research that “paints disinformation as an overwhelming if not exclusive problem for the right.” 

In exploring how true that narrative is, Freelon outlined two possibilities. The first is that the literature and journalism are "actually true in [representing] the way things actually are." The second is that there is insufficient research and journalism on right-wing activism and left-wing disinformation, and "the combination of these two aspects is generating an incomplete empirical portrayal."

The truth may also be somewhere in between. 

“Academics love to say that now but it’s possible that the truth may be further to one end of this spectrum than the other,” Freelon said. “This is really something that both journalists and academics should really try to focus on in the years ahead.”

The idea of false balance

While there may be left-wing disinformation, Freelon said “the evidence is quite clear that right-wing disinformation poses a very distinct danger,” as seen during the insurrection at the U.S. Capitol on Jan. 6. 

Freelon said he does not consider acknowledging left-wing disinformation to be false balance, because disinformation content can be shared and read by both sides. The question of consequences, Freelon said, is an "analytically distinct question" that is "totally separate from the prevalence of the disinformation itself."

“I think that if there was any doubt before the Jan. 6 insurrection at the Capitol of the disparity in terms of the consequences of right-wing disinformation relative to those of left-wing disinformation, I would hope that those questions have been definitively settled by those events.”

Disinformation spreads independently of mainstream news

“You can’t simply say, ‘OK, just because mainstream news doesn’t engage with this therefore it’s not a problem,’ ” Freelon said. 

A false story claiming that former President Trump was involved in child sex trafficking, Freelon said, was "almost entirely ignored by mainstream media," yet it "picked up 1.1 million retweets within three or four days." Freelon said many of those retweets came from accounts that were not politically relevant, had few followers, were not habitual news consumers and were interested in K-pop, anime and the like.

“There’s a lot we don’t know about this,” Freelon said, highlighting that there are alternative ways of circulating content in the 21st century. Computational journalists and academicians can contribute by tackling questions surrounding who is sharing this content and how it fits with the rest of their media diets.

There is little doubt that at least some left-wing disinformation is out there, but null results are important too

Freelon said that searching for left-wing disinformation and not finding it would itself be an important finding.

“You need to know that we left no stone unturned looking for it but we barely turned up anything,” he said.

This knowledge will enable earnest conversations about “the relative risk factors that people on different areas of the ideological spectrum are experiencing,” he said.

Human and non-human disinformation agents belong to the same category

When an audience member asked Freelon’s perspective on bots, he said that there are a lot of helpful bots that don’t lie about who they are, and that he tends to categorize bots “less in terms of the difference between humans and automated social media systems and more in terms of what they do.”

“I think people sometimes automatically assume that they may be harmful or disinformation agents and forget that there are some good ones too,” Freelon said. “For me, the proper way of identifying them is by what they do whether they are human or non-human, not by whether they have blood running through their veins or the silicon equivalent.”
