
Takeaways from Harvard’s Fake News Conference

A recent conference gathered a few dozen academics, all concerned about the future of news, in a Harvard Law School conference room. In a media climate marked by an apparent disregard for facts, and with the president himself seemingly determined to undermine trust in the press, participants set out to answer the $64,000 question: How do we solve the problem of fake news?

The summit brought together psychologists, sociologists and computer scientists to shed light on the channels through which fake news is distributed. Its organizers also welcomed legal scholars—to explore the limits of potential legal interventions—and communications scholars and political scientists to help understand the impacts of fake news.

The event, “Combating Fake News: An Agenda for Research and Action,” was held February 17 and was co-sponsored by Harvard University’s Shorenstein Center on Media, Politics and Public Policy and the Ash Center for Democratic Governance and Innovation, along with Northeastern University’s NULab for Texts, Maps and Networks and Network Science Institute.

“We need to understand what sort of people are drawn to fake news and what about it appeals to them,” said Matthew Baum, a professor of public policy at Harvard and one of the event’s organizers. “We see this conference as the start of a conversation rather than a standalone event.”


Fake news polarizes, fueled by viral networks

David Lazer, a distinguished professor of political science and data science and co-director of Northeastern’s NULab for Texts, Maps and Networks, had an ominous take: “I would define fake news as a sub-genre of misinformation, as misinformation regarding the state of the world, as disregard for the facts,” Lazer told attendees. “It simultaneously misinforms often by appealing to the very worst of our human nature.”

Lazer believes that misinformation, especially as it permeates the online landscape, contributes to political polarization. According to a 2016 report from the Pew Research Center, 93 percent of Republicans are more conservative than the average Democrat today, while 94 percent of Democrats are more liberal than the average Republican. Two decades ago, those numbers were much smaller: 64 percent and 70 percent, respectively.

Cass Sunstein, a Harvard Law professor who ran the White House Office of Information and Regulatory Affairs under President Obama, gave a keynote address that focused on the nature and dangers of groupthink. “Once your initial inclination is corroborated, you become more confident. And once you become more confident, you become more extreme,” Sunstein said.

Echo chambers reinforce both polarization and extremism, Sunstein said.

“If you get a group of people who think, let’s say, President Trump’s immigration policy is an excellent idea,” Sunstein said, “the arguments that favor the predisposition will be more numerous.” Additionally, if people listen to one another, they’ll create more arguments favoring that same predisposition.

Maya Sen, an assistant professor of public policy at Harvard who moderated the conference’s first panel, pointed out that people typically don’t like listening to or reading things they disagree with. However, that isn’t limited to opinions.

“People are in echo chambers of like-minded facts,” Sen said. Some of Sen’s recent research focuses on the avenues of distribution for fake news. While Twitter and Google play a role, she said, Facebook really stands out. Still, she doesn’t believe that fake news alone led to President Donald J. Trump’s election.

“There have been ideas out there that fake news swung the election,” Sen said. “I think that’s an implausible claim.”

Lazer shared research on fake news that he and his team had conducted using Twitter data. They aimed to find the sources of fake news: who was sharing the misinformation and who was retweeting it? “Seventy percent of fake news was shared by 15 people out of our 22,000,” Lazer said. He referred to these 15 users as “super-sharers”—super-active users who give their account information to third parties so that they’re tweeting almost constantly.

Even in October, just before the election, some of the most widely shared content—fake news—was tweeted by these super-sharers.
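To make the super-sharer finding concrete (70 percent of fake news from just 15 of 22,000 accounts), here is a minimal sketch of how one might measure that concentration from a table of tweets. This is an illustration, not Lazer's actual pipeline; the schema and the toy data are made up.

```python
from collections import Counter

def top_sharer_concentration(tweets, k=15):
    """Fraction of fake-news tweets posted by the k most active
    fake-news sharers. `tweets` is an iterable of (user_id, is_fake)
    pairs; the field layout here is hypothetical."""
    counts = Counter(user for user, is_fake in tweets if is_fake)
    total = sum(counts.values())
    top_k = sum(n for _, n in counts.most_common(k))
    return top_k / total if total else 0.0

# Toy data: user "a" posts 7 of the 10 fake-news tweets.
tweets = ([("a", True)] * 7 + [("b", True)] * 2 + [("c", True)]
          + [("d", False)] * 5)
print(top_sharer_concentration(tweets, k=1))  # -> 0.7
```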

Information overload strains the system, too. Filippo Menczer, professor of informatics and computer science and director of the Center for Complex Networks and Systems Research at Indiana University, said that while individuals tend to have their own biases, virality can be skewed simply by the sheer quantity of items competing for attention.


“If you don’t have this personal preference, some things will still go viral,” Menczer said. “But when you add that actual individuals have a preference for sharing better-quality information, you will find that if there is a large load of information, […] then the system as a whole is incapable of discriminating among information based on its quality.”
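That dynamic is easy to see in a toy model. The sketch below is my own illustration of the limited-attention idea, not Menczer's actual model, and every parameter (feed size, arrival rate, step count) is an assumption. An agent reshares memes from a short feed, preferring higher quality, while new memes keep arriving; as the arrival rate, the “load” Menczer describes, rises, the link between a meme's quality and its popularity weakens.

```python
import random
from statistics import correlation  # Python 3.10+

def simulate(steps=20000, attention=10, arrival_rate=0.5, seed=1):
    """Toy limited-attention model: new memes (each with a random
    intrinsic quality in [0, 1]) enter a short feed; otherwise the
    agent reshares a feed item, weighting choices by quality.
    Returns the correlation between quality and reshare counts."""
    rng = random.Random(seed)
    quality, shares, feed = [], [], []
    for _ in range(steps):
        if rng.random() < arrival_rate or not feed:
            quality.append(rng.random())   # a new meme arrives
            shares.append(0)
            feed.append(len(quality) - 1)
        else:
            m = rng.choices(feed, weights=[quality[i] for i in feed])[0]
            shares[m] += 1                 # reshare, favoring quality
        if len(feed) > attention:
            feed.pop(0)                    # limited attention: oldest drops out
    return correlation(quality, shares)

# Low load vs. high load: the quality signal degrades as load rises.
print(simulate(arrival_rate=0.2), simulate(arrival_rate=0.9))
```

Under low load the agent reshares each meme many times before it scrolls away, so quality wins out; under high load most memes vanish unshared and popularity becomes close to noise.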

Menczer added that while Trump may not be representative of the system as a whole, he shows how such content travels: after Twitter bots mentioned @realDonaldTrump in a fake news article from InfoWars alleging that 3 million immigrants had voted illegally, Trump repeated the claim as fact.

Facts aren’t always loud enough

Adam Berinsky, a professor of political science at M.I.T. and director of the Political Experiments Research Lab, began his talk by projecting a photo of a political button given to him by a research assistant in 2012. The button showed former President Barack Obama with the words “Made in the U.S.A.” and a photo of his birth certificate. Berinsky cited a July 2010 poll—conducted in the midst of the “birther” movement that called into question Obama’s birthplace—that found that 55 percent of Americans believed Obama was born in the U.S., 27 percent did not and 19 percent were unsure.

“Even releasing information doesn’t stem the tide of rumors,” Berinsky said. “This really is genuine uncertainty, but it’s tinged with a healthy dose of skepticism from these people. Moving forward, these are the people we want to focus on.”

Berinsky grew up in New York City in the 1970s and ’80s. During that time, he said, fliers alleging conspiracies about the 1984 election between Ronald Reagan and Walter Mondale would be pasted to the sides of construction sites. That kind of misinformation was easy to ignore.

“Now, we get rumors like this,” Berinsky said, referring to Trump’s allegations that millions of people voted illegally in the 2016 election. “Trump,” Berinsky said, “is the megaphone of rumors.”

Fake news uses real journalists

However, many people are not hearing rumors directly from @realDonaldTrump. According to Emily Thorson, an assistant professor of political science at Boston College, news consumers are more likely to come across false statements in the context of corrections.

“I want you to think about the first time that you heard Donald Trump say that crime was at the highest rate it’s been in 47 years,” Thorson said. “You were probably not at the sheriff’s meeting where he said that.” Instead, most news consumers learned about the comments from news websites debunking the claim.

Thorson is concerned that when publications like The Washington Post and The Wall Street Journal take such pains to fact-check false claims, they also make those claims more widely known. “When people are so aggressive about putting out corrections, they also, in the process, publicize the misinformation,” Thorson said. “Even when we cognitively know that the misinformation is false, it sometimes isn’t enough to erase that automatic and unconscious effect. That can linger.”

Rebuilding trust in the press requires working with outsiders

In the U.K., the news landscape is less partisan. According to Helen Boaden, former director of BBC News and BBC Radio and a Joan Shorenstein Fellow at the Harvard Kennedy School, the BBC still reaches 94 percent of the U.K. population each week. She believes consumers largely trust it because the BBC provides not only news but also radio, music stations, comedy television and other services. For Boaden, being vital to an audience is paramount.

“We have a moral obligation to make sure everyone gets something they care about from us on a regular basis,” Boaden said.

Lori Robertson, too, focused on the relationship between news outlets and consumers. Robertson is managing editor at FactCheck.org, a nonpartisan, nonprofit project of the Annenberg Public Policy Center at the University of Pennsylvania.

“We like to say that the news readers are really the first line of defense against fake news and misinformation in general,” Robertson said. “We’re trying to give them some clues on how to spot it.”


According to Robertson, many people who contact FactCheck.org to verify a story or a claim are already skeptical. This is healthy, and it’s something she and her colleagues want to nurture.


Eli Pariser, co-founder of Upworthy, a site that aims to make “meaningful” content go viral, said that when he founded the site in 2012, viral content mostly meant things like Keyboard Cat and “a FAIL video of some idiot surfing off his roof.” Politics and international affairs, meanwhile, were considered boring by the general public. Much like the BBC, Upworthy aimed to reach readers by appealing to their interests.

“When journalists talk about trust, they talk about it in a way that is very divorced from what I heard social scientists talking about in the morning,” Pariser said. “If what we want to do is engender trust, then we need to think about how we demonstrate to people that we have their interests at heart.”

Pariser also brought up an institution that has come under severe fire over the last year: Facebook. First with its Trending Topics scandal in May 2016, then with its status as a breeding ground for fake news after the election, Facebook has been attacked and questioned by media critics who say that it is an information distributor, not merely a platform—and that it had better start acting like one.

“It’s kind of unacceptable, in a democracy, to have that little insight into a place where information is being transacted,” Pariser said.

David Rothschild, an economist at Microsoft Research, shared Pariser’s desire for greater transparency but said Facebook has historical reasons to resist it.

“We need to open these places up to research, and we need to create the incentive to make them want to do this, because I can understand why Facebook wouldn’t,” Rothschild said.

For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged in, according to The Atlantic. Some people were shown more positive content than average; others were shown sadder content. At the end of the week, the manipulated users were more likely to use especially positive or negative words in their own posts, mirroring what they had seen.

The experiment, and its results, were published in a 2014 paper. Outrage ensued, with Twitter users calling the study “terrifying” and Facebook itself “awful.” Since then, Facebook has not published any comprehensive data.

Many attendees at Combating Fake News remarked that no Facebook representatives were present. In fact, at least one was: Alex Leavitt, a quantitative user experience researcher at Facebook, was in attendance.

Fear undermines institutions, but we can move the truth forward

So why do people continue to spread fake news? According to Michael Schudson, a professor of journalism at Columbia University, people become susceptible to rumors and falsehoods through “fear, anxiety, resentment, but fear most of all.” Additionally, when friends and family—people we trust—share fake news stories, we’re more likely to believe them. “Endorsement by people we should believe, people we normally believe to be credible authorities, makes fake news more believable,” Schudson said.

Yochai Benkler, faculty co-director of the Berkman Klein Center for Internet & Society at Harvard, painted a bleak picture. He began by noting that while Fox News undermines traditional media, Breitbart undermines even Fox. Though he spoke in academic terms, he expressed real concern for the future of the country.

“The core problem,” Benkler said, “is, ‘What happens to democracy in the presence of intentional efforts to deny the validity of basic methods of defining the range of reasonable disagreement? Of trust in the professions—journalists, scientists and now judges?’ ”

By the end of the summit, a conference where fake news had dominated the conversation, the main takeaway was, fittingly, to keep fake news from dominating the conversation. Adam Berinsky of M.I.T. stressed the importance of pushing forward with real stories from real outlets.

“Don’t repeat misinformation,” Berinsky said. “Seek an alternative narrative. And a really big thing is finding credible sources to counter.”

Rowan Walrath
