If you ask pretty much anyone whether news feeds and algorithms have contributed to the polarization of American society over the past 20 years, the answer will likely be a resounding yes. We take it as a given that big tech has steered our culture into a more contentious place than ever. But new research suggests that this might be an overly simplistic reading of the situation.
From the Washington Post:
The first results of that research show that the company’s platforms play a critical role in funneling users to partisan information with which they are likely to agree. But the results cast doubt on assumptions that the strategies Meta could use to discourage virality and engagement on its social networks would substantially affect people’s political beliefs.1
The researchers conclude that while Facebook is great at reinforcing beliefs people already hold, it is not nearly as effective at changing someone’s mind. The beliefs we bring into these online interactions serve as our litmus test for what counts as “fake news” versus what counts as new information worth adding to our repository.
This story made me think about an article in the Summer 2023 edition of The New Atlantis. Writer Adam Elkus shares:
So much intellectual effort has been devoted to the reasons why machines — bureaucratic or technical — might execute human desires in a way vastly different than human beings intend. Little to no effort has been exerted in exploring the converse: how humans might confound machines trying to get them to do what the machines want.2
For years, we have taken it as a given that an algorithm could show humans information that would change the way they think, but that theory of how the world works doesn’t account for the “friction” caused by humans being humans. Friction is “the way, broadly speaking, the world pushes back when we push forward,” Elkus writes. “Battlefield information proves faulty, men and machines break down, and all manner of other things go wrong. And of course, the greatest source of friction is other people.”3
When we imagine machines slowly leading us to our demise (via social media algorithms polarizing our country, or Skynet taking over the world), we typically envision a frictionless situation where everything goes right for the machine. “The computer programs — representing purified, idealized intelligence — never encounter any serious difficulties, especially in getting humans to do what they want, despite being totally dependent on others to act due to their lack of physical embodiment. Because the machines are simply so much smarter than us, they are able to bypass all of the normal barriers we encounter in getting others to do what we want, when we want, and how we want it.”4
So while algorithms are excellent at feeding us content that we will engage with, at the end of the day, humans will be humans. And humans being humans is the greatest source of friction the news feed algorithms have ever encountered: despite being shown countless pieces of information that could change our minds, we generally stick with what we believed in the first place.
This is not to say that we shouldn’t be alarmed by many of the study’s other findings (and there are plenty), but I think the way it complicates our picture is also enlightening. Polarization is not a problem that a silver bullet can solve. We can’t just “fix social media” and count on everything else to be okay. You can’t engineer the perfect algorithm to serve specific people just the right pieces of information and call it a day.
The way to change hearts and minds is much more nuanced than that. Yes, at the margins, the information we encounter through mass media has an effect, but the greatest impact comes from humans being humans: interacting with each other in ways that computers and mere exposure to ideas can’t compete with.
With that, thanks for reading, and see you again soon.
Social image by Brett Jordan on Unsplash