Facebook has announced it will take "stronger" action against users repeatedly sharing misinformation on the social media platform.
Facebook will reduce the distribution in its news feed of all posts from accounts that repeatedly share content flagged as false by one of the company's fact-checking partners, the social media giant said in a blog post.
It added that it was also launching ways to notify people when they interact with content that has been rated by a fact-checker.
False claims and conspiracies have proliferated on social media platforms, including Facebook and Twitter, during the COVID-19 pandemic.
"Whether it's false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we're making sure fewer people see misinformation on our apps," the company said in a statement.
Earlier this year, Facebook said it took down 1.3 billion fake accounts between October and December, ahead of a hearing by the U.S. House Committee on Energy and Commerce on how technology platforms are tackling misinformation.