In a blog post Wednesday, Facebook said it will no longer allow content that uses "militarized" language to encourage poll watching, or that seeks to "intimidate, exercise control or show power over election officials or voters". Facebook credited the update to its platform rules to the civil rights experts it worked with in creating the policy.
Monika Bickert, Facebook's Vice President for Content Policy, explained the new rules in a call with reporters, stating that the wording prohibits posts that use words like "army" or "fight" – a choice that appears to directly affect the Trump campaign's effort to recruit an "Army for Trump" to watch the polls on election day. Last month, Donald Trump Jr. urged supporters in a video posted on Facebook and other social platforms to enlist "now" in an "Army for Trump election security operation".
"Under the new policy, we would actually remove the video if it were republished," said Bickert.
The company says posts calling for "coordinated interference" or for showing up armed at polling stations were already subject to removal, but the expanded policy will better address concerns about voter intimidation. Facebook will apply the expanded policy only going forward; it will not affect content already on the platform, including the Trump Jr. post.
Watching polls to ensure fair elections is a regular part of the process, but recruiting those watchers to seek evidence for unsubstantiated allegations of "fraudulent ballots" and "rigged" voting is something new – and something that looks more like voter intimidation. Poll watching laws vary by state, and some states limit how many poll observers can be present and how they must identify themselves.
Trump has repeatedly refused to say that he will accept the election results if he loses, a position that poses an unprecedented threat to the peaceful transfer of power in the US. That concern is one of many that social media companies and voting rights advocates are anxiously keeping an eye on as election day approaches.
"Donald Trump is not interested in electoral integrity, he is interested in voter suppression," said Debra Cleaver, founder of VoteAmerica, of the Trump campaign's poll watching efforts. "Sending armed guards to the polls is a solution to a problem that doesn't exist, unless you think Black and brown people voting is a problem."
Facebook is also making changes to its political advertising rules. The company will stop running political ads immediately after the election to head off chaos and false claims.
"While ads are an important way to express your voice, we plan to temporarily stop running all social issue, electoral, or political ads in the US after the polls close on November 3rd, to reduce the risk of confusion or abuse," wrote Guy Rosen, Facebook's Vice President of Integrity, in a blog post. Rosen added that Facebook will notify advertisers when these ads will be allowed again.
Facebook also gave a glimpse of what its apps will look like on an unusual election night. The company will place a notification at the top of the Facebook and Instagram apps showing the status of the election, to preempt false claims.
These messages will remind users that "votes are still being counted" before switching to a message that "a winner has been projected" once a reliable consensus has been reached on the race. Because the election results may not be known on election night this year, users may see these messages well after November 3rd. If a candidate declares a premature victory, Facebook will add one of these labels to that content.
Facebook also noted that it is now using a viral content review system, a measure designed to prevent misinformation or otherwise harmful content from drawing thousands of views before it is eventually removed. According to Facebook, the tool, which it has relied on "throughout the election season", provides a safety net the company can use to identify content that violates its rules and act to limit its circulation.
In the final month before the election in particular, Facebook has become less reluctant to police misinformation and other harmful political content on its platform. The company announced Tuesday that it would no longer allow the pro-Trump conspiracy theory known as QAnon to flourish there, as it has for the past four years. Facebook also removed a post this week in which President Trump, who had just been hospitalized for several days, claimed COVID-19 was "far less lethal" than the flu.