Facebook: Our Comprehensive Approach to Protecting the 2020 U.S. Elections Through Inauguration Day
- Our comprehensive strategy to protect the 2020 US election began years before the electoral cycle itself and was designed to last through Inauguration Day. It included some temporary measures, which were only a small part of our larger strategy.
- We expanded our policies in 2020 to disrupt militias on our service and have since banned over 890 militarized social movements. We also put in place rules prohibiting QAnon and militias from organizing on our platform.
- As with any major public event, debate over the election results was bound to appear on Facebook. But the responsibility for the insurrection lies with those who broke the law during the attack and those who incited them, not with how we implemented just one of the many measures we took to protect the US election.
Long before the US election period began last year, we expected the 2020 election to be one of the most contested in history – and that was before we knew it would take place in the midst of a pandemic. We have been working since 2016 to invest in the people, technology, policies and processes needed to make sure we were ready, and we began planning for the 2020 election itself two years in advance. We built our strategy to run through Inauguration Day in January 2021, knowing there was a high probability the election results would be challenged, and we planned specifically for that scenario. This election planning rested on all of the other integrity work and investments we have made since 2016, which included:
- Building a global team of 35,000 people to work on safety and security; today it numbers more than 40,000
- Enabling more people to participate, including helping register more than 4.5 million voters in the United States last year
- Activating our Election Operations Center, made up of Facebook subject-matter experts, to identify and respond to the various threats we saw on the platform – and keeping it in place through the US election and the post-election period, until after the inauguration
- Pursuing covert influence operations that sought to use our platform to interfere, as happened in 2016. As a result, we removed networks targeting the United States engaged in coordinated inauthentic behavior, including five from Russia, five from Iran, one from China, and five domestic networks of US origin
- Expanding our policies and investments to remove militias and prevent QAnon from organizing on our platform
- Taking additional measures to limit the virality of potentially harmful content, such as demoting content that our systems predicted might constitute violence and incitement, and temporarily reducing the distribution of content that our systems predicted might be false, pending review by a third-party fact-checker
- Phasing out of our News Feed ranking models, starting at the end of 2019, the predictions of whether a viewer would reshare content and whether other people would engage with it. These changes targeted content predicted to be about politics or social issues, and they remain in place today
- Banning voter suppression content: from March 2020 through Election Day, we removed over 265,000 pieces of Facebook and Instagram content in the United States for violating our voter interference policies
- Partnering with third-party fact-checkers to help us identify election-related misinformation and remove content that violated our rules
- Providing more transparency about political advertising on the platform through our Ad Library, so people can see who is behind those ads
- Requiring anyone who runs political or social-issue ads on Facebook to verify their identity and be authorized to run them
- Suspending new political and social-issue ads for the seven days before the election, preventing any new political ads from launching during that period, and then pausing such ads entirely between Election Day and the inauguration
- Creating a dedicated Voting Information Center to ensure people had reliable information about the election and how to vote. Once the race was called, the Voting Information Center promoted the accurate election results, and we kept it in place long after Election Day
- Placing a notification at the top of Facebook and Instagram clearly stating that Joe Biden was the projected winner once a majority of independent decision desks at major media outlets, including ABC, CBS, CNN, Fox, NBC and the AP, had called the race
- Adding labels to voting and election posts, including those from politicians, with a link to the Voting Information Center so people could get the latest updates on the vote count and the results. After the election, we also applied labels naming the projected winner to all presidential candidates' posts, with a link to our Voting Information Center to learn more about the election results. And, when we became aware of it, we added labels to content that misrepresented the election or vote-counting process, including information from the Bipartisan Policy Center
We also put in place a series of temporary product measures for moments when there were specific risks that spikes in activity on the platform could outpace the many systems we had in place to enforce our policies. Examples include limiting the distribution of live videos that our systems predicted might relate to the election, and automatically removing potentially violating content at lower confidence thresholds than we would normally use, before it was even reviewed by our team. We took these steps in response to specific signals we were seeing on the platform, such as spikes in flagged content, and we deactivated some of them responsibly and gradually as those signals returned to their previous levels. We also left many of them in place through Inauguration Day.
While this last point is important, it was, as you can see, only one part of a much longer series of actions that we took before, during and after Election Day. In preparing for the election, we anticipated multiple potential outcomes and considered the many societal factors relevant to understanding and responding to the risk of violence.
This is largely why we developed these additional product levers for extraordinary circumstances, which we internally referred to as "break the glass" measures. It is also why we kept our full suite of systems in place, including many of the "break the glass" measures, long after Election Day – even after seeing the specific signals of potential threats level off, and more than a month after major news organizations called the election for now-President Joe Biden.
It is absurd to blame what happened on January 6 on how we implemented just one item from the list above. We are a large social media platform, so it is only natural for content about major events like this to appear on Facebook. But the responsibility for the insurrection itself lies entirely with the insurrectionists who broke the law and those who incited them. We worked with law enforcement in the days and weeks following January 6 to make available information connecting those responsible to their crimes. Of course, there are always lessons to be learned from the work we do to protect elections and respond to both immediate threats and longer-term challenges, and we will apply those lessons as we continue all of this work.
Facebook Inc. published this content on 22 October 2021 and is solely responsible for the information it contains.