Facebook is planning for possible unrest around the November 3rd US presidential election, using internal tools it has deployed before in countries like Sri Lanka and Myanmar.
The plans may include slowing the spread of posts as they begin to go viral; altering the news feed algorithm to change what content users see; and changing the rules for what kind of content is dangerous and warrants removal.
They’re strategies Facebook has previously used in so-called “at-risk” countries dealing with mass ethnic unrest or political bloodshed.
The tools would only be used in the event of election-related violence or other serious circumstances, according to the WSJ. But some employees at the company said they were concerned that attempting to slow down viral content could unintentionally suppress legitimate political discussion.
Facebook’s handling of violent hate speech against Rohingya Muslims in Myanmar several years ago was widely criticized.
After a 2018 independent assessment of the situation, the social media giant conceded it wasn’t “doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.” It pledged to better prepare for future risks.
Facebook CEO Mark Zuckerberg said in a September blog post that the US presidential election “is not going to be business as usual.” He said he was “worried that with our nation so divided and election results potentially taking days or weeks to be finalized, there could be an increased risk of civil unrest across the country.”
Platforms are bracing for pre- and post-election uncertainty in the US, as President Trump has repeatedly criticized mail-in voting, which many people are using this election cycle due to the coronavirus pandemic. He has also declined to say whether he would accept the election results if he loses.