With just two months left until the United States presidential election, Facebook says it is taking more steps to encourage voting, minimise misinformation and reduce the likelihood of post-election “civil unrest”.
The company said on Thursday it will restrict new political advertisements in the week before the election and remove posts that convey misinformation about COVID-19 and voting.
It will also attach links to official results to posts from candidates and campaigns that declare premature victories.
“This election is not going to be business as usual. We all have a responsibility to protect our democracy,” Facebook CEO Mark Zuckerberg said in a post. “That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.”
Facebook and other social media companies are under scrutiny over how they handle misinformation, as President Donald Trump and other candidates have posted false information and Russia continues its attempts to interfere in US politics.
Facebook has long been criticised for not fact-checking political ads or limiting how they can be targeted at small groups of people.
With the nation divided, and election results potentially taking days or weeks to be finalised, there could be an “increased risk of civil unrest across the country”, Zuckerberg said.
In July, Trump refused to publicly commit to accepting the results of the upcoming election, as he scoffed at polls that showed him lagging behind Democratic rival Joe Biden.
Trump has also made false claims that the increased use of mail-in voting because of the coronavirus pandemic allows for voter fraud. That has raised concerns over the willingness of Trump and his supporters to abide by election results.
Asked in a CBS News interview aired on Thursday whether he had personally engaged with Trump about his posts on voting, Zuckerberg said he did not think he had done so recently.
But Zuckerberg said he had had “certain discussions with him in the past where I’ve told him that I thought some of the rhetoric was problematic”.
Under the new measures, Facebook says it will prohibit politicians and campaigns from running new election advertisements in the week before the election. However, they can still run existing advertisements and change how they are targeted.
Posts with obvious misinformation on voting policies and the coronavirus pandemic will also be removed, the company said.
On Messenger, Facebook's messaging app, users will be able to forward articles to a maximum of five people at a time. The company will also work with the Reuters news agency to provide official election results and make the information available both on its platform and through push notifications.
After being caught off-guard by Russia’s efforts to interfere in the 2016 US presidential election, Facebook, Google, Twitter and other companies put safeguards in place to prevent it from happening again.
That includes taking down posts, groups and accounts that engage in “coordinated inauthentic behavior” and strengthening verification procedures for political advertisements.
Last year, Twitter banned political advertisements altogether, while Alphabet’s Google limited the ways in which election advertisers could micro-target voters.
Zuckerberg said Facebook had removed more than 100 networks worldwide engaging in such interference over the last few years.
“Just this week, we took down a network of 13 accounts and two pages that were trying to mislead Americans and amplify division,” he said.
But experts and Facebook’s own employees say the measures are not enough to stop the spread of misinformation – including from politicians and in the form of edited videos.
Facebook had previously drawn criticism for its advertisement policy, which cited freedom of expression as the reason for letting politicians like Trump post false information about voting.