How did we end up being so angry with those who have opposing political views?
Back in the 1960s, the divide was small: few Republicans or Democrats said they would be concerned if their child married someone from the other political party.
But by 2010, about 50 percent of Republicans and one-third of Democrats said they would be upset by such an “intermarriage,” according to research by Iyengar, Sood, and Lelkes.
What’s to blame? Social media platforms, such as Facebook, argued Jaime Settle, an associate professor of government and director of the Social Networks and Political Psychology Lab at the College of William & Mary in Virginia. Settle spoke at the Misinformation Speaker Series held by NULab of Northeastern University on Oct. 2.
Settle, sharing the main argument from her book, “Frenemies: How Facebook Polarizes America,” said that the way we communicate on Facebook facilitates the polarization of the American public.
A broad range of posts is considered “political”
Most people don’t use Facebook for political purposes. We scroll through our friends’ activity feeds without intending to look for political content, Settle said.
But Settle pointed to an intriguing puzzle in the statistics: “78 percent of Facebook users said they never or seldom share political content, while 69 percent of them said they understand others’ political views through Facebook.”
The inferences we make about others’ political viewpoints stem from what circulates through their posts, the news content and sources they share, and their discussions around a variety of topics, she said.
In one experiment Settle designed, participants read Facebook posts of news headlines, such as “School Children Rescued Out of Bus After Snow Storm Accident,” that participants did not agree were political or apolitical. Settle attached different news sources to the same headline. When the source was listed as Fox News, half of those polled said the post came from a Republican, compared to 40 percent when the headline carried no source at all.
This showed that although people do not think they are sharing or reading political content on Facebook, they judge others’ political views based on the kind of content those people share, Settle said.
“We are diverse, and they are all similar.”
The filter bubbles Facebook creates through opaque algorithms strengthen our sense of group identity and reinforce our biases against others, intensifying the out-group homogeneity effect, Settle said. We come to believe that conservative people are surrounded by conservative friends and that extremists have friends who share the same extreme ideology.
As a result, more people develop negative attitudes not only toward elite political figures but also toward people they know who disagree with them on politics, Settle said.
Especially when reading posts from out-group members with disagreeable viewpoints, we tend to see the group as homogeneous and attach political labels to its members based on our biases, she said.
We keep repeating this cycle of observing, judging, and labeling, which makes us more and more confident in classifying others’ political leanings, Settle said.