Social media firms were on high alert Tuesday against Election Day misinformation and manipulation efforts as voters began in-person balloting across the United States following a tumultuous campaign.
Aiming to avoid the problems that occurred in the 2016 campaign, Facebook, Twitter and Google-owned YouTube were implementing policies aimed at heading off last-minute attempts to spread false information designed to influence or intimidate voters.
Facebook said it had activated its Election Operations Center, which was monitoring the platform in real time.
"Our Election Operations Center will continue monitoring a range of issues in real time - including reports of voter suppression content," said a Facebook statement posted on Twitter.
"If we see attempts to suppress participation, intimidate voters, or organize to do so, the content will be removed."
Facebook said its election center was also tracking other incidents, such as efforts by supporters of President Donald Trump to surround campaign buses for Democrat Joe Biden.
"We are monitoring closely and will remove content calling for coordinated harm or interference with anyone's ability to vote," Facebook said.
Facebook reiterated that it would place warning labels on any posts that seek to claim victory prematurely.
"If a presidential candidate or party declares premature victory, we will add more specific information in the labels on candidate posts, add more specific information in the top-of-feed notifications and continue showing the latest results in our Voting Information Center," the social giant said.
Along with other social platforms, the company has promised to stem misinformation around the election, including premature claims of victory, seeking to avoid a repeat of 2016 manipulation efforts.
Over the past days, Facebook and Twitter added disclaimers to Trump posts calling into question the integrity of mail-in ballots.
Trump's post on Twitter said a slow vote count in battleground state Pennsylvania could lead to "rampant and unchecked cheating."
"It will also induce violence in the streets. Something must be done!" he tweeted.
Twitter last month updated its "civic integrity policy" aiming to prevent efforts to manipulate or interfere in elections, which calls for action against false claims of victory or any incitement of violence.
YouTube has also sought to limit the sharing of videos with election misinformation. Last month it began adding information panels to videos about voting by mail.
Mail-in voting was added to a short list of topics that YouTube considers prone to misinformation, such as COVID-19 and the Moon landing, according to the Google-owned video-sharing platform.
The panel appears under such videos regardless of who is speaking or who uploads them, according to YouTube.
Some activists noted that the platforms' efforts to curb the spread of false information were being undermined by loopholes and glitches.
The activist group Avaaz said it had found multiple examples of unverified election claims on Facebook in recent days.
Some comments say the "left" is planning a "coup" if Trump wins, while others argued without any factual basis that Trump would need to win Pennsylvania by four to five points "to overcome voter fraud," according to Avaaz.
Facebook acknowledged this week that some political ads banned for containing misleading information were resurfacing, with political groups copying the same content for new messages to slip through filters.