Facebook will start adding new labels and information to posts about voting and the election, as it attempts to tackle misinformation and other issues ahead of November.
The changes are intended both to encourage people to vote and to ensure they do so with reliable information, combating misleading stories about the election. Social networks have already rushed to stop the spread of false stories about issues such as mail-in voting – which many US citizens will be doing for the first time – including adding notes to posts by Donald Trump.
The newly introduced tools include a hub for information on voting that borrows from Facebook’s efforts to stop false information around coronavirus. The Covid-19 hub has been seen by billions, Facebook has said, and the new voting tools will take largely the same form.
Despite its efforts, Facebook continues to face widespread criticism over how it handles misinformation around elections and other matters. The company has generally refused to fact-check ads by politicians, for instance, and a two-year audit of its civil rights practices faulted the company for leaving US elections “exposed to interference by the President and others who seek to use misinformation to sow confusion and suppress voting.”
The effectiveness of such labels will depend on how well Facebook’s artificial intelligence system identifies the posts that really need them, said Ethan Zuckerman, director of the Massachusetts Institute of Technology’s Center for Civic Media. If every post containing the word “vote” or “voting” gets an informational link, he said, “people will start ignoring those links.”
Facebook expects the voter hub to reach at least 160 million people in the US, said Emily Dalton Smith, who serves as head of social impact at the company. The primary focus is registering people to vote, she said, but the information people see will evolve throughout the election season.
“This is a unique election and a unique election season,” she said. “Certainly we have never gone through an election during a global pandemic.”
Other tech companies, including Twitter and Google, which owns YouTube, have undertaken similar efforts around the November election. Twitter said it is working on expanding its policies to address “new and unique challenges” related to this year’s elections, including misinformation around mail-in voting.
Looking ahead to November, Facebook said it is “actively speaking with election officials about the potential of misinformation around election results as an emerging threat.”
The company did not give details on the potential threats, but said that a prolonged ballot process where results are not immediately clear “has the potential to be exploited in order to sow distrust in the election outcome.”
“One way we plan to fight this is by using the Voting Information Center and the US Elections digest in Facebook News to make sure people have easy access to the latest, authoritative information and news on and after Election Night,” Naomi Gleit, vice president of product management and social impact, wrote in a blog post.
Additional reporting by agencies