As we know, social media has a tremendous effect on the minds of the masses. Platforms deploy algorithms to deliver personalised content to users, and those algorithms play a large role in making sure users encounter only information that agrees (positively or negatively) with their existing beliefs.
A study by the Kellogg School of Management analysed 12 million social media accounts to explore how users behave in politically polarised online environments.
Specifically, the research team looked at the “likes,” shares and comments garnered by particular videos that were hosted on YouTube but also embedded on 413 different Facebook pages.
The results show that these online content consumers become more polarised over time; even those who held two opinions at once quickly became more polarised in their views, generally after the fiftieth comment, according to the research.
And the thing that makes people's thinking evolve is an even more polarising point of view, so the echo chamber becomes like quicksand, drawing you in deeper. I call this the Quicksand Effect.
What About Debunking?
According to another study by the same team of researchers, debunking actually leads people to strengthen their polarised beliefs. Go figure!
So, what's the answer? There is clearly no 'one size fits all' approach, but education is a proven way to challenge people's thinking before they share or comment. We also need to challenge platforms to do more to stop their algorithms from enabling this indoctrination.
Reference
Alessandro Bessi, Fabiana Zollo, Michela Del Vicario, Michelangelo Puliga, Antonio Scala, Guido Caldarelli, Brian Uzzi, Walter Quattrociocchi
Author: Greig Dowling