Not stopping 'Stop the Steal:' Facebook Papers paint damning picture of company's role in insurrection

Just days after insurrectionists stormed the Capitol on January 6th, Facebook's Chief Operating Officer Sheryl Sandberg downplayed her company's role in what had happened.

"We know this was organized online. We know that," she said in an interview with Reuters. "We... took down QAnon, Proud Boys, Stop the Steal, anything that was talking about possible violence last week. Our enforcement's never perfect so I'm sure there were still things on Facebook. I think these events were largely organized on platforms that don't have our abilities to stop hate and don't have our standards and don't have our transparency."

But internal Facebook documents reviewed by CNN suggest otherwise. The documents, including an internal post-mortem and a document showing in real time the countermeasures Facebook employees were belatedly implementing, paint a picture of a company that was in fact fundamentally unprepared for how the Stop the Steal movement used its platform to organize, and that only truly swung into action after the movement had turned violent.

Asked by CNN about Sandberg's quote and whether she stood by it, a Facebook spokesperson pointed to the greater context around Sandberg's quote. She had been noting that Jan. 6 organization happened largely online, including but not limited to on Facebook's platforms, the spokesperson said.

The documents were provided by Facebook whistleblower Frances Haugen as evidence to support disclosures she made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen's legal counsel. The redacted versions were obtained by a consortium of 17 U.S. news organizations, including CNN.

One of Haugen's central allegations about the company focuses on the attack on the Capitol. In an SEC disclosure, she alleges, "Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection."

Facebook denies the premise of Haugen's conclusions and says Haugen has cherry-picked documents to present an unfair portrayal of the company.

"The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them. We took steps to limit content that sought to delegitimize the election, including labeling candidates' posts with the latest vote count after Mr. Trump prematurely declared victory, pausing new political advertising and removing the original #StopTheSteal Group in November," Facebook spokesperson Andy Stone told CNN Friday.

"After the violence at the Capitol erupted and as we saw continued attempts to organize events to dispute the outcome of the presidential election, we removed content with the phrase 'stop the steal' under our Coordinating Harm policy and suspended Trump from our platform."

Facebook also on Friday night published a blog post by its vice president of Integrity, Guy Rosen, about its efforts around the 2020 election.

"Our enforcement was piecemeal"

Among the tens of thousands of pages of documents Haugen provided is an internal analysis of how the Stop the Steal and Patriot Party movements spread on Facebook, first reported by BuzzFeed News earlier this year.

"Hindsight is 20:20," the author or authors of the analysis, who are not identifiable from what was provided, write. "[A]t the time it was very difficult to know whether what we were seeing was a coordinated effort to delegitimize the election, or whether it was protected free expression by users who were afraid and confused and deserved our empathy. But hindsight being 20:20 makes it all the more important to look back to learn what we can about the growth of the election delegitimatizing movements that grew, spread conspiracy, and helped incite the Capitol insurrection."

The analysis found that the policies and procedures Facebook had in place were simply not up to the task of slowing, much less halting, the "meteoric" growth of Stop the Steal. For instance, those behind the analysis noted that Facebook treated each piece of content and person or group within Stop the Steal individually, rather than as part of a whole, with dire results.

"Almost all of the fastest growing FB Groups were Stop the Steal during their peak growth," the analysis says. "Because we were looking at each entity individually, rather than as a cohesive movement, we were only able to take down individual Groups and Pages once they exceeded a violation threshold. We were not able to act on simple objects like posts and comments because they individually tended not to violate, even if they were surrounded by hate, violence, and misinformation."

This approach did eventually change, according to the analysis -- after it was too late.

"After the Capitol insurrection and a wave of Storm the Capitol events across the country, we realized that the individual delegitimizing Groups, Pages, and slogans did constitute a cohesive movement," the analysis says.

This was not the only way in which Facebook had failed to anticipate something like Stop the Steal, or in which its response was lacking.

Facebook has for some time now had a policy banning "coordinated inauthentic behavior" on its platforms. This ban allows it to take action against, for instance, the Russian troll army that worked to interfere with the 2016 US election through accounts and pages set up to look as if they were American. But, the analysis notes with emphasis, the company had "little policy around coordinated authentic harm" -- that is, little to stop people organizing under their real names and not hiding their intention to get the country to reject the results of the election.

Stop the Steal and Patriot Party groups "were not directly mobilizing offline harm, nor were they directly promoting militarization," the analysis says. "Instead, they were amplifying and normalizing misinformation and violent hate in a way that delegitimized a free and fair democratic election. The harm existed at the network level: an individual's speech is protected, but as a movement, it normalized delegitimization and hate in a way that resulted in offline harm and harm to the norms underpinning democracy."

The analysis does note, however, that once Facebook saw the results of Stop the Steal on January 6th and took action, it was able to deploy measures that stymied the growth of both Stop the Steal and Patriot Party groups.

Facebook's Stone told CNN, "Facebook has taken extraordinary steps to address harmful content and we'll continue to do our part. We also closely worked with law enforcement, both before January 6th and in the days and weeks since, with the goal of ensuring that information linking the people responsible for January 6th to their crimes is available."

Pulling levers

Haugen began gathering evidence about the company before she eventually left the tech giant last May. To reduce the chance of getting caught taking screenshots of internal Facebook systems, she used her phone to take photographs of her computer screen.

As the insurrection was underway in Washington and Facebook was trying to get a handle on the situation, Haugen was snapping pictures, documenting the company's response.

One of the documents she captured, titled "Capitol Protest BTG [Break the Glass] Response," was a chart of measures Facebook could take in response to the January 6th attack. The chart appears to have been prepared beforehand; at the time Haugen photographed it, a little less than two hours after the Capitol was first breached, the company had instituted some of those measures while others were still under consideration. Among the potential actions listed in the chart were demoting "content deemed likely to violate our community standards in the areas of hate speech, graphic violence, and violence and incitement."

The page labeled these as "US2020 Levers, previously rolled back."

Those "levers," as Facebook refers to them, are measures -- guardrails -- that the company put in place before last year's Presidential election in an attempt to slow the spread of hate and misinformation on the platform. Facebook has not been clear in its public statements about what measures it did roll back after the election and why it did so at a time of tumult when the sitting president was calling the results of the vote into question.

But according to the "Capitol Protest BTG response" document, the guardrails Facebook reimplemented on January 6th included reducing the visibility of posts likely to be reported and freezing "commenting on posts in Groups that start to have a high rate of hate speech and violence & incitement comments," among others.

In the SEC disclosure, Haugen alleges that these levers were reinstated "only after the insurrection flared up."

Asked about the decisions to dial the levers back and then push them out again, Stone said, "In phasing in and then adjusting additional measures before, during and after the election, we took into account specific on-platform signals and information from our ongoing, regular engagement with law enforcement. When those signals changed, so did the measures."

A through line

When Facebook executives posted messages publicly and internally condemning the riot, some employees pushed back, even suggesting Facebook might have had some culpability.

"There were dozens of Stop the Steal groups active up until yesterday, and I doubt they minced words about their intentions," one employee wrote in response to a post from Mike Schroepfer, Facebook's chief technology officer.3

Another wrote, "All due respect, but haven't we had enough time to figure out how to manage discourse without enabling violence? We've been fueling this fire for a long time and we shouldn't be surprised it's now out of control."

Other Facebook employees went further, claiming decisions by company leadership over the years had helped create the conditions that paved the way for an attack on the US Capitol.

Responding to Schroepfer's post, one staffer wrote that "leadership overrides research based policy decisions to better serve people like the groups inciting violence today. Rank and file workers have done their part to identify changes to improve our platforms but have been actively held back."

Another staffer, referencing years of controversial and questionable decision-making by Facebook leadership around political speech, concluded, "history will not judge us kindly."
