For many, checking social media has become a routine of logging on, seeing something that makes them mad or upset, and repeating that cycle over and over again.

If that sounds familiar, it’s not your imagination.

Max Fisher is a journalist who focuses on the impact of social media on global conflicts and our daily lives, and has covered it extensively for The New York Times.

In his new book, The Chaos Machine, Fisher details how the polarizing effect of social media is accelerating. He joined All Things Considered to explain why tech companies are taking advantage of this outrage and the danger it could pose to society.

“Remember, the number of seconds in your day never changes. The amount of social media content competing for those seconds, however, doubles roughly every year, depending on how you measure it. Imagine, for example, that your network produces 200 posts a day, of which you have time to read about 100. Due to the platform’s tilt, you will see the most outraged half of your feed. Next year, when 200 doubles to 400, you will see the most outraged quarter; the year after that, the most outraged eighth. Over time, your impression of your own community becomes radically more moralizing, aggrandizing, and outraged, and so will you. At the same time, less innately engaging forms of content, such as truth, appeals to the greater good, and appeals to tolerance, become increasingly outmatched, like stars over Times Square.”

– An excerpt from The Chaos Machine
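The arithmetic in that excerpt can be traced in a few lines. Below is a minimal sketch using the excerpt’s illustrative numbers (a fixed attention budget of roughly 100 posts read per day, with the volume of competing posts doubling each year); the variable names and the four-year horizon are assumptions for illustration only.

    # Sketch of the excerpt's arithmetic: attention stays fixed while the
    # volume of posts competing for it doubles each year. If ranking favors
    # the most outrage-inducing posts, the visible slice is the "most
    # outraged" fraction of the feed.
    posts_read_per_day = 100   # fixed attention budget (from the excerpt)
    posts_produced = 200       # starting daily volume (from the excerpt)

    for year in range(4):
        visible_fraction = posts_read_per_day / posts_produced
        print(f"Year {year}: {posts_produced} posts/day -> "
              f"you see the most outraged {visible_fraction:.0%} of your feed")
        posts_produced *= 2    # content volume doubles roughly every year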

This interview has been lightly edited for length and clarity.


Interview Highlights

Why Social Media Algorithms Drive Users Toward Outrage

When you log on to Facebook, Twitter, or YouTube, you think that what you see is a neutral reflection of your community and of what [your community] is talking about. When you interact with it, you think you are getting feedback from your peers, from other people online. But in fact, what you are seeing and what you are experiencing are choices made by these incredibly sophisticated automated systems, which are designed to figure out exactly what combination of posts, how to sequence those posts, and how to present them to you will most engage certain very specific cognitive triggers and cognitive weak points. They are meant to trigger certain emotions. They are meant to trigger certain impulses and instincts that will keep you coming back to the platform and spending a lot of time there.

Those [upsetting posts] are the things that engage us the most, because they speak to a sense of social compulsion, of a group identity that is ‘threatened.’ Moral outrage, in particular, is probably the most powerful form of content online. And that’s the kind of content that engages your eyeballs and engages your emotions the most, because it taps into these deeply evolved instincts that we have as social animals, as group animals, for basically self-preservation.

On how outrage helps social media platforms achieve their audience goals

What the systems that run YouTube and govern what you see have figured out is that, to serve that [viewership] objective, they should serve up content that creates a kind of sense of crisis, a feeling that you and your identity are under threat.

So what that can mean is that if you’re looking for, say, health advice or vaccine information, the best thing YouTube can show you isn’t just health information. The best thing YouTube can show you is something that makes you feel like you are part of a community of, say, mothers who are worried about their children, and that this community is threatened by some outside danger. That will trigger a sense of alarm, and it will make you want to come back and spend more and more time watching.

On whether most people can use social media without becoming radicalized

For the overwhelming majority of us, the effect is subtle. Spending more time on social media will make you significantly more polarized; it will give you a much harsher view of people from the other party, or even of people who simply support a different figure within the party you support; it will lead you to hold harsher opinions toward outgroups in general; and it will make you more inclined to feel outrage and moral outrage yourself. I think it’s something we all feel. And that might ring true for those of us who spend time on social media and don’t become crazy conspiracy theorists, but still feel pulled in that direction.

On the possible solutions

Every time I asked the experts who study this what they think should be done, the answer was some version of turning it off. Not turning off the whole platform, not shutting down the website. But turning off the algorithm. Turning off likes, the little counter at the bottom of a post that shows how many people have liked or retweeted it. That’s something that even Jack Dorsey, the former head of Twitter, floated as an idea, because he came to see how harmful it was.

But turning off those engagement-maximizing features is something that we have actually experimented with. And a version of social media like that, I think, could potentially bring a lot of the good that [social media] brings, which is real, while mitigating some of the harm.

Copyright 2022 NPR. To learn more, visit https://www.npr.org.