Controlling our news feeds: 'Echo chambers' and 'filter bubbles'

İBRAHIM ALTAY
Illustration by Necmettin Asma

As we increasingly surround ourselves with arguments that support our own opinions alone, we inevitably risk being stuck in an echo chamber, making ourselves more susceptible to fake news

Whether it is about making a quick buck from advertising revenues or furthering personal or collective political goals through unending and inaccurate propaganda, fake news continues to be a major problem in the age of social media.

Although the issue is well-covered in the media, modern society seems nowhere near a solution. One of the main reasons is that fake news manages to trigger either self-assurance or righteous anger in readers, becoming a desirable alternative for those who find accurate news disagreeable.

On Nov. 10, as Turkey was commemorating the anniversary of the death of its founder, Mustafa Kemal Atatürk, fake news sources in the country managed to do exactly this: stir up controversy with retouched photos, false claims and similar disinformation calculated to galvanize the public. Two such examples were caught by teyit.org, a social enterprise that fact-checks widespread claims circulating on social media.

Turkey has not been the only victim of this disturbing phenomenon, as false quotes, fabricated news and fake photos also flooded social media feeds in the U.S. ahead of Veterans Day on Nov. 11.

This pattern of triggering happiness or anger resembles the methods of a con artist, and the comparison is not far off in the case of fake news. In the examples seen in Turkey, the driving emotion is anger. With echo chambers in place, that anger only grows, and by the time a level-headed attempt is made to discover the true nature of the news in question, it is already too late.

What is an echo chamber, though? In this case, one definition matters. In news and media, an echo chamber is a situation in which only sources that support the beliefs of a group are accepted, amplified and reinforced through repetition, framing a subjective worldview or argument as absolute fact.

When it comes to social media, the definition of an echo chamber changes slightly, but its core remains the same. The most important difference is that it becomes far easier to form an echo chamber, a space where only information we approve of or beliefs we support can exist, than a genuine public sphere.

After all, haven't you ever encountered posts from friends on your social media feeds that compel you to unfriend them over subjective political disagreements? Haven't you unfriended a person for such disagreements? Or have you been unfriended for similar reasons? All of these actions pave the way for the formation of echo chambers, making us more vulnerable to fake news.

This is especially true of fake news that promotes self-assuredness. After all, it is even more difficult to cast doubt on a piece of news that confirms or supports your political argument while vilifying those who disagree. In the homogeneous sphere of social interaction created by these echo chambers, we are much less likely to encounter someone critical, and therefore suspicious, of the information a fake news piece provides.

Research published on Oct. 31 delved into this issue, as well as the larger scope of fake news and similar problems in the media, combining all under the umbrella of "information disorder." Regarding the subject of echo chambers, the research indicated that "The 'public sphere' is the shared spaces, either real, virtual or imagined, whereby social issues are discussed and public opinion is formed. This theory was first shared by German sociologist and philosopher Jurgen Habermas, who argued that a healthy public sphere is essential for democracy and must be inclusive, representative and characterized by respect for rational argument. The most significant challenge to any theory of a shared public sphere is that humans, when given the choice of who to connect with or who not to connect with, tend to establish and continue relationships with people who have views which are similar to our own. We are programmed to enjoy spending time in 'echo chambers,' as it requires less cognitive work."

Prepared by Claire Wardle, Ph.D., and Hossein Derakhshan, supported by the research of Anne Burns and Nic Dias, the study in question was published under the title "Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking" by the Council of Europe (CoE), with the support of the Shorenstein Center on Media, Politics and Public Policy at Harvard Kennedy School and First Draft.

Another important conclusion that can be drawn from the research is that digital and social media present a further fundamental roadblock to any effort to diversify these echo chambers: filter bubbles. As you likely already know, the content you see on the web varies depending on your habits. This is especially the case on social media, which takes stock of your interests, hides some of the content that falls outside them and promotes specific types of information in order to keep you engaged. This, of course, carries important ethical considerations even if we disregard individual privacy concerns.

These algorithms are now commonplace on nearly every website built around them, from e-commerce platforms to video-sharing sites. Even the "related articles" sections seen at the bottom of news articles form a filter bubble, despite being based on subject matter rather than your actual activity on a website.
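The filtering mechanism described above can be illustrated with a deliberately simplified sketch. The code below is a toy model of my own construction, not any platform's actual algorithm: it ranks posts by how many of their tags overlap with a user's recorded interests and keeps only the best matches, showing how dissenting or corrective content can simply never surface.

```python
# A toy "filter bubble": rank posts by overlap with a user's recorded
# interests and surface only the best-matching ones.
# Illustrative simplification; real recommendation systems are far more complex.

def personalize(posts, interests, feed_size=2):
    """Return the posts whose tags best match the user's interests."""
    def score(post):
        # Engagement proxy: how many of the post's tags the user already likes.
        return len(set(post["tags"]) & interests)
    ranked = sorted(posts, key=score, reverse=True)
    return ranked[:feed_size]

posts = [
    {"title": "Party A rally draws crowds", "tags": {"party_a", "politics"}},
    {"title": "Party B policy analysis",    "tags": {"party_b", "politics"}},
    {"title": "Fact-check: viral photo",    "tags": {"fact_check"}},
    {"title": "Party A wins local poll",    "tags": {"party_a", "election"}},
]

user_interests = {"party_a", "election"}
feed = personalize(posts, user_interests, feed_size=2)
# The resulting feed contains only Party A stories; the fact-check and
# the opposing viewpoint are filtered out before the user ever sees them.
```

Note that nothing in this sketch is malicious: each step merely optimizes for what the user already engages with, which is exactly how a narrowing feed can emerge without anyone intending it.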

Regardless, social media can be considered one of the worst offenders when it comes to filter bubbles, as it also taps into society's tendency to form a controlled environment populated only by like-minded individuals.

According to the said research, "The fundamental problem is that 'filter bubbles' worsen polarization by allowing us to live in our own online 'echo chambers,' leaving us only with the opinions that validate, rather than challenge, our own ideas. While confirmation bias occurs offline and the term 'selective exposure' has been used by social scientists for decades to describe how information-seekers use only those certain sources that share or purport their views, social media is designed to take advantage of this innate bias."

Since the latest presidential election in the U.S., social media, and especially Facebook, have come under fire amid suspicions that the high concentration of fake news originating on social media was one of the chief architects of the Trump victory.

In January, Facebook stated that it had removed personalization from its "Trending Topics" section. Facebook also reworked the "Related Articles" section shown beneath news articles on its platform, moving away from the old method of simply showing similar articles. Under the new approach, the section aims to bring together varying viewpoints on the same subject in order to break these echo chambers.

If we are to properly understand the origins of misinformation and our inability to deal with fake news in a concrete manner, we must first understand its continued popularity despite the constant guidelines, criticisms and steps being taken to combat it.

Hopefully, this research will be an important step toward realizing that the problem with fake news does not lie solely with readers, the mainstream media or social media, but involves all three. It will take active effort from all three to stop the spread of misinformation.
