Beyond the mere one-week duration, there's a caveat. Only new advertisements will be refused; if political groups already have ads in the Facebook pipeline, they will be allowed to boost spending on and visibility of those ads during the final week. Facebook will still be showing plenty of political ads in that final week; what the one-week ban provides is a small buffer zone in which Facebook doesn't have to scurry to deal with brand-new false claims disseminated by a campaign or its allied groups.
Given the network's extremely slow response when political groups are proven to have made false claims, the ban doesn't mean false political claims won't appear in the election's last week. It just means those claims have to have been in the queue beforehand. Another reason for skepticism: given Facebook's historic inability or reluctance to enforce its own rules, it's not immediately clear whether this new policy will be applied any more rigorously than any of the others.
Perennial rulebreakers like Ben Shapiro and his network of faux-aggregators, for example, will still be receiving the company’s seemingly personalized protections.
Other new policies announced, as reported by both The Washington Post and The New York Times: messages forwarded through the company's Messenger app will be capped at five recipients, curbing the friend-of-a-friend-of-a-friend chains that so readily propagate hoaxes. An expansion of existing policy will label or remove not just posts that explicitly discourage Americans from voting, but also posts that give false or misleading information about voting eligibility or voting-related pandemic dangers, or that otherwise impede voting rights.
And in the days and weeks after the election, while votes are still being counted, candidates' claims of victory or of fraud will be tagged with a label pointing readers to the current, official results. This one appears tailor-made to deal with Donald J. Blowhard Trump's propagation of "fraud" conspiracy theories, but it won't remove the false claims. It will just label them.
Skepticism, then, is in order. Facebook head Mark Zuckerberg and top-level company executives with unapologetic arch-conservative leanings have gamed the site's own rules to shelter conservative disinformation and stifle proper fact-checking, and throughout the company's history they have remained brazenly apathetic toward its now well-proven role as a political disinformation and propaganda hub. The company has faced increasing public pressure, however, as well as outright public anger, over its role in enabling "Q" conspiracies, militia groups, Russian government-produced hoaxes, and worldwide violence.
These new rules are mostly intended to distance the social network from whatever new political or intelligence scandals might crop up in the days immediately before and after the election. Whether they prove effective at actually hindering such efforts remains to be seen.