The prevalence of fake articles and shady news distribution sites on Facebook made waves during the recent US elections for their hand in influencing the vote, leading to highly debated and widely unexpected results. Fake news on Facebook has triggered an online outrage among users who think it had a tremendous impact on the controversial US election results. The distribution of fake news throughout social media, and primarily through Facebook, where 67 percent of Americans interact every day, is a factor worth considering. But is it right to say that fake news sites are all to blame?
Facebook: Fake News, Echo Chambers, and News Algorithms
Many believe that with billions of Facebook users turning to the site for world updates and information, it is impossible to think otherwise. A big bulk of today's information comes from social media, and as more and more people turn to Facebook as their main source of daily news, the biggest challenge lies in controlling the influx of fake news and gaining widespread support for promoting only accurate and trustworthy content. However, taking a closer look at what makes fake news so damaging brings to mind the belief that it is, in fact, the echo chamber effect on Facebook that plays a significant role in why things are what they are.
Many experts have confirmed that the echo chamber effect on Facebook is very much alive. For those who are unfamiliar with the term, what exactly is the echo chamber effect?
The premise of the echo chamber effect lies in Facebook's use of algorithms to learn and identify what types of content its users like so it can push similar content to their news feeds. Over time, Facebook users may have noticed how smart their news feed has become and how it continually evolves to promote content according to what they like, share and engage with. Although this is great for showcasing and highlighting a user's top interests, things change dramatically when beliefs and political affiliations come into question. Many fear that this same algorithm may quietly filter out content that disagrees with your own set of beliefs or affiliations. Access to opposing mindsets and information could have helped you gain a new perspective or learn something different. This is the heavily debated downside of the echo chamber effect on Facebook.
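The feedback loop described above can be sketched in a few lines of Python. This is a toy model, not Facebook's actual ranking system; the topics, the scoring rule and the "user likes the top post" behavior are all invented for illustration. The feed simply ranks posts by how often the user has engaged with each topic, and the user's clicks feed that signal straight back in:

```python
import random
from collections import Counter

random.seed(42)  # fixed seed so the toy run is repeatable

# Hypothetical topics for the sketch -- not real Facebook categories.
TOPICS = ["politics-left", "politics-right", "sports", "science"]

def rank_feed(posts, engagement):
    # Rank posts by the user's past engagement with each post's topic.
    # (A missing topic scores 0 via Counter's default.)
    return sorted(posts, key=lambda topic: engagement[topic], reverse=True)

engagement = Counter()
for _ in range(50):
    posts = [random.choice(TOPICS) for _ in range(10)]
    feed = rank_feed(posts, engagement)
    # The user "likes" only the top-ranked post, and that like is fed
    # back into the ranker for the next round.
    engagement[feed[0]] += 1

# After a few dozen rounds, a single topic dominates the engagement
# history, so the ranker keeps surfacing it first: a minimal echo chamber.
print(engagement.most_common())
```

Notice that nothing in the sketch is malicious: the ranker just optimizes for engagement, yet the feed still collapses toward one topic, which is the dynamic the "filter out disagreeing content" worry points at.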
How does the echo chamber effect work?
Social scientists Antonio Scala, Walter Quattrociocchi and Cass Sunstein believe there is enough evidence to demonstrate the downside of how echo chambers work in social media, and primarily on Facebook. They found that many social media users promoted and projected their own beliefs, created their own social groups based on their own ideologies and often excluded everyone who did not hold the same opinion. They further found that many Facebook users tended to interact only with friends who shared the same beliefs. These users also heavily promoted content, in the form of various narratives reinforcing those same beliefs, throughout their own communities. Driven by confirmation bias, these are the echo chambers that form across Facebook's social network and push for the widespread sharing of views and belief systems in the form of news and articles.
Another observation was that users often rejected information that did not coincide with their belief systems. For the most part, they sought out only news and other information that followed the same narrative, and made the most of social media to share and promote it in order to validate their own beliefs. Perhaps the most alarming aspect of the echo chamber effect in social media is that when false information is introduced, people continue to accept it as credible as long as it supports their own narratives. With confirmation bias at work, even the more obviously credible sources were ignored in favor of the false information and news that supported their claims.
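The pattern described here, agreement trumping credibility, can also be sketched as a tiny filter. Everything below is an invented illustration (the posts, the "pro"/"anti" labels and both rules are assumptions, not data from the study): a belief-driven reader shares whatever matches their narrative, while a credibility-driven reader checks sourcing instead.

```python
# Toy sketch of confirmation-bias filtering; the posts and rules are
# invented for illustration, not drawn from the research described above.
posts = [
    {"claim": "pro",  "credible": True},
    {"claim": "pro",  "credible": False},  # false, but agrees with the belief
    {"claim": "anti", "credible": True},   # credible, but disagrees
    {"claim": "anti", "credible": False},
]

def belief_driven(post, belief):
    # Shares anything that matches the narrative, ignoring credibility.
    return post["claim"] == belief

def credibility_driven(post, belief):
    # Shares only well-sourced items, regardless of narrative.
    return post["credible"]

shared = [p for p in posts if belief_driven(p, "pro")]
trusted = [p for p in posts if credibility_driven(p, "pro")]

print(shared)   # both "pro" posts spread, including the false one
print(trusted)  # only the credible posts, whichever side they support
```

The belief-driven filter passes the false "pro" post while blocking a credible "anti" one, which is exactly the failure mode the observation above describes.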
It is not surprising to see many conspiracy theories rehashed, reused and now making their way around Facebook once again, given the intensity with which these types of information are promoted and shared in groups and pages to support various narratives. New conspiracy theories continue to emerge and easily gain traction through social media's highly interconnected networks. They continue to attract widespread attention and are often used to bring into focus belief systems that come into question with the popularity of various world events. We have indeed reached the age of unlimited information, where almost anyone online has access to whatever they need.
Now that information is so easily accessible, social media has more than doubled its reach through its massive user base, and primarily through the inner workings of Facebook's echo chamber effect. Many would ask whether there is a better way to control the spread of fake news being shared on social media and to ensure that only facts make their way through feeds.
One of the biggest problems Facebook faces now is the controversy over Trending news. Many have suspected that Facebook may have indirectly tampered with how news listings were arranged for its own benefit. The potential danger is that this could either bring major problems into focus or push sensitive issues out into the open, which may cause more of a divide. In today's society, where most people get their news through Facebook, the idea of the company controlling what people see in their feeds is a scary thought, especially with many people unable to tell real news from fake.
Understanding how the echo chamber effect works on Facebook is also important for marketers, who need to know what types of content resonate with their audience and what may be pushed to the back and willfully ignored. To overcome the adverse effects of echo chambers on Facebook, marketers should create content that caters to specific audience segments, improving their reach and making the most of user engagement by encouraging interaction with different groups in their audience.