July 28, 2021

DCTRS

Damascus Center for Theoretical and Civil Rights Studies

Social media slow to limit users protesting vote-counting

Efforts by President Trump and his supporters to sow doubt in the integrity of election results continue to pose challenges for Facebook and other social media companies, as the vote-counting process remains underway.

Twitter has worked to restrict tweets from the president that spread false narratives undermining the election, flagging many of his posts as potentially misleading, but baseless claims of election fraud have continued to percolate across the platform at speed.

On Facebook, Trump’s supporters are picking up the president’s message and running with it, prompting the platform to announce new measures Thursday afternoon to tamp down election misinformation.

A Facebook group called Stop the Steal 2020, which attracted more than 360,000 members before it was suspended by the social giant Thursday, provided a notable example of how the false narrative is spreading.

The group was created Wednesday by Women for America First, a political nonprofit founded in 2019 to organize protests against Trump’s impeachment proceedings. The nonprofit has gone on to organize anti-lockdown protests across the country.

The “Stop the Steal” phrase first gained traction on social media on election day and was boosted that night, when Trump tweeted the unsubstantiated claim that “they” are “trying to STEAL the election.” On Wednesday, variations on the theme could be heard on the streets of U.S. cities where votes were being counted. In Detroit, Trump supporters gathered at a vote-counting site, chanting at election workers to “stop the count.” That night, a crowd gathered outside a Phoenix election center, chanting “count the votes” and “stop the steal,” some openly carrying firearms.

Posts claimed without evidence that big chunks of votes were missing or invalid, accused poll workers of being biased against conservatives, or implied that it’s taken several days to count votes because of malicious activity by Democrats.

As the Facebook group’s membership ticked into the hundreds of thousands of users, the group’s creators asked members to provide contact information, anticipating that the platform might soon shut it down.

New members were prompted to enter their contact information at a separate website before being allowed to join. There, members were greeted with a message pushing the conspiracy theory meant to undermine the election: “Democrats are scheming to disenfranchise and nullify Republican votes. It’s up to us, the American People, to fight and to put a stop to it.” After asking for members’ names, emails, and states, the website also asked for a donation to support the cause.

Facebook suspended the group before 11 a.m. Thursday.

“In line with the exceptional measures that we are taking during this period of heightened tension, we have removed the Group ‘Stop the Steal,’ which was creating real-world events,” a Facebook spokesperson said in a statement. “The group was organized around the delegitimization of the election process, and we saw worrying calls for violence from some members of the group.”

The social media company declined to answer additional questions about the timing of the decision, but it broadly aligns with election-related misinformation policies that Facebook laid out in preparation for what was expected to be a long vote-counting process.

Social media posts and other online media with hashtags or phrases pushing the false narrative that the election was being stolen from the president surged online over the course of the day Wednesday, peaking before 9 p.m. Pacific before seeing another bump Thursday morning, according to the media intelligence platform Zignal Labs.

The phrase “Stop the Steal” itself was mentioned more than 1.2 million times over the same time period, and continued to spread on Twitter on Thursday afternoon. In a statement, the company said that it was “proactively monitoring” the hashtag and that it was flagging specific tweets that contained misinformation. Twitter declined to provide a rationale as to why the hashtag itself — which asserts that there is a theft underway — did not qualify to be flagged across the board.

Twitter’s civic integrity policy says that it plans to take action on “misleading information about election outcomes,” which includes “disputed claims that could undermine faith in the process itself, such as unverified information about election rigging, ballot tampering, vote tallying, or certification of election results.”

Kat Lo, a researcher who studies online content moderation at the nonprofit Meedan, said one of the big challenges of the election is misinformation that is presented ambiguously, such as the claims made in the Stop the Steal group. For example, one user posted a screenshot of a ballot tracker showing their ballot had not been received, and implied it was part of some widespread conspiracy.

“When a post can be feasibly contested, platforms have a difficult time deciding how to police that type of thing,” Lo said.

Of the tech platforms, Lo said, Twitter has the most well-developed strategy for dealing with misleading information.

Twitter slaps on fact-checking links and in some cases places a screen entirely over an offending tweet. By contrast, Facebook continues to surface content that it flags as questionable. And while Twitter has moved to block retweets, Facebook does not prohibit sharing.

“Preventing information from being as shareable, as viral, is a crucial part of harm mitigation,” Lo said.

YouTube, she said, has struggled to develop and enforce a policy to address gray area content.

A review by the nonprofit Media Matters found that YouTube videos pushing misinformation about the results of the 2020 presidential election have received more than 1 million views, despite the platform’s community guidelines prohibiting “content that aims to mislead people about voting.”

After former presidential advisor and right-wing commentator Steve Bannon called for violence against FBI Director Christopher A. Wray and federal pandemic expert Anthony Fauci, YouTube deleted that episode from his channel; Twitter suspended his account outright.

“I’m most impressed with Twitter, Facebook is doing alright, YouTube is doing awful,” Lo said.

YouTube spokesperson Alex Joseph said the company has a three-strikes policy before an account is terminated. Though Bannon’s channel is still live, the strike against the account temporarily blocks him from uploading new videos for at least a week.

“We’ve removed this video for violating our policy against inciting violence. We will continue to be vigilant as we enforce our policies in the postelection period,” Joseph said in an email.

Even if platforms are being more proactive, much of their effort focuses on high-profile figures, and it’s “anybody’s guess” whether they are being as proactive in moderating the rest of their users, Lo said. Furthermore, ambiguous posts by high-profile figures that evade moderation efforts can still do a lot of damage.

In response to the flood of misinformation around the election, Facebook said it would implement new tools and “demote” content on Facebook and Instagram that the company’s system predicts may be misinformation, including debunked claims about voting. The company also said it would limit the distribution of live videos on Facebook that relate to the election.