Facebook has been under fire since the last U.S. presidential election, as many believed fake news on social media had a sizable impact on Trump's victory over Clinton. Critics divided the blame, accusing Russia of rigging the election while holding Facebook and other social media sites responsible for willingly or unwillingly providing an exploitable platform.
In the fallout, Facebook said it was adamant about combating fake news spread on its network by propaganda accounts and fake news sources. On Oct. 9, 2017, a Reader's Corner article titled "A self-remedy from Facebook" outlined some of the steps the social media giant had taken to tackle this issue. They included: "Better identifying false news through our community and third-party fact-checking organizations so that we can limit its spread, which, in turn, makes it uneconomical. Applying machine learning to assist our response teams in detecting fraud, and enforcing our policies against inauthentic spam accounts. Updating our detection of fake accounts on Facebook, which makes spamming on a scale much harder."
Another step was to implement a set of guidelines to help the user base spot fake news more effectively and stop being part of the problem by spreading it unknowingly.
On Nov. 22, 2017, Facebook announced a different step it was planning to take in a blog post titled "Continuing Transparency on Russian Activity." The post promised that the site would implement a tool for users to determine whether they had ever encountered or spread fake news produced by the Internet Research Agency, a Russian company that gained worldwide notoriety in 2015 as the "Russian trolls," striving to promote Russian interests on a number of issues through propaganda and information manipulation on the internet.
Facebook came through on its promise last week when it launched the tool, which outlined user interactions with Internet Research Agency content. The tool also allowed users to vet their Instagram accounts.
Currently, the tool is not available worldwide. The guidelines mentioned earlier were likewise initially limited in availability but were recently offered in some other countries through local partnerships. In Turkey, Facebook decided to work with Kadir Has University, and perhaps we will see a similar practice worldwide.
This can be considered a step in the right direction because, as with other malicious sources, those who produce fake news should be exposed in order to curb the spread. Fighting individual fake stories is often a futile effort, because a lie can travel halfway around the world while the truth is putting on its shoes. Yet Facebook should not limit itself to Russian or foreign sources only; it should take similar steps against domestic sources as well. This is a slippery slope that will depend heavily on editorial integrity and objectivity. Otherwise, it can easily turn into a story of who has the better propaganda machine, instead of a unified stand against malicious information manipulation.