U.S. Senator Alex Padilla (D-Calif.), a member of the Senate Rules Committee and the Senate Homeland Security and Governmental Affairs Committee, joined Senator Amy Klobuchar (D-Minn.) and 11 of their colleagues in a letter to Meta CEO Mark Zuckerberg expressing concern regarding Meta’s response to the rise of online election-related misinformation and disinformation on its platforms, including Facebook.
Padilla was Thursday’s KVML “Newsmaker of the Day”.
The letter follows reports and Senate testimony indicating that the company prematurely terminated misinformation and disinformation safeguards that were put in place in advance of the 2020 election. This action allowed misinformation, disinformation, and violent rhetoric to return to Facebook immediately following Election Day and in the lead-up to the January 6th insurrection.
“While efforts to delegitimize election results and undermine our democracy continued and even intensified following Election Day, reports indicate that Facebook turned off election-related safeguards because the company was concerned that they could be limiting the growth of the platform,” the senators wrote. “The controls demonstrate that Facebook clearly knew that its platform could be used to sow and promote discord, division, and incendiary content.”
The senators ask Meta to justify its decision to dial back post-election controls to curb disinformation and violent rhetoric and to explain its current work to guard against disinformation and violence on its platforms.
Padilla and Klobuchar were joined on the letter by Senators Jack Reed (D-R.I.), Mazie Hirono (D-Hawaii), Sherrod Brown (D-Ohio), Jeff Merkley (D-Ore.), Cory Booker (D-N.J.), Richard Blumenthal (D-Conn.), Sheldon Whitehouse (D-R.I.), Tammy Baldwin (D-Wis.), Patrick Leahy (D-Vt.), Mark Warner (D-Va.), and Dick Durbin (D-Ill.).
Here is the full text of the letter:
Dear Mr. Zuckerberg:
We write to express concern regarding Meta’s response to the rise of online election-related misinformation and disinformation in the United States and the accompanying rise in divisive, hateful, and violent rhetoric that undermines confidence in the integrity of our elections. The false claim that the 2020 presidential election was stolen fueled a violent and deadly insurrection at the U.S. Capitol on January 6th. The misinformation and disinformation that led to the insurrection, as well as the planning for the insurrection, took place largely on online platforms, including Facebook.
In particular, recent reports based on documents released by Facebook whistleblower Frances Haugen as well as Ms. Haugen’s testimony before the Senate Commerce Committee indicate that the company prematurely terminated misinformation and disinformation safeguards that were put in place in advance of the 2020 election. This action allowed misinformation, disinformation, and violent rhetoric to return to the platform immediately following Election Day and in the lead-up to the January 6th insurrection.
According to those documents, safeguards implemented by Facebook during the run-up to the 2020 election included measures to ban or remove hateful or violent content, prevent the growth and spread of groups or content that “delegitimized” the 2020 election, and reduce the spread of hate speech. Even with these measures in place, reports state that “nearly a quarter of Facebook users reported seeing hate speech ahead of the election and that more than half reported seeing content that made them wary of discussing political issues in public.” While efforts to delegitimize election results and undermine our democracy continued and even intensified following Election Day, reports indicate that Facebook turned off election-related safeguards because the company was concerned that they could be limiting the growth of the platform.
The controls demonstrate that Facebook clearly knew that its platform could be used to sow and promote discord, division, and incendiary content. Facebook also took some steps after the election to fight disinformation, including banning a group called “Stop the Steal” that promoted the lie that the 2020 election had been stolen. Still, Facebook began dialing back its misinformation and disinformation safeguards shortly after the election. The company also disbanded its Civic Integrity Team, which had been formed to combat misinformation and disinformation on Facebook, and distributed its members to other parts of the company. Since January 6th, Facebook has disavowed any responsibility for the insurrection and declined to implement a recommendation from its own Oversight Board to conduct an internal study of the platform’s role in the insurrection.
Other groups tied to the insurrection were also able to overcome the limited controls Facebook had left in place. Even after Facebook ultimately banned “Stop the Steal” on November 5 – when it already had more than 350,000 members – the false claim that the election had been stolen thrived on the platform. According to reporting based on the whistleblower documents, other groups promoting the false claim that the election had been stolen proliferated, successfully evading Facebook’s controls and stoking the anger that led to the January 6th insurrection.
At the same time, the spread of misinformation and disinformation about the election resulted in an unprecedented rise of violent threats against election officials, workers, and volunteers. Based on election disinformation, Facebook users sent hate speech, death threats, and bomb threats to those responsible for administering elections. According to a recent study, one in three local election officials now feel unsafe because of threats made against election officials for doing their jobs.
In light of these reports, and in order to protect the integrity of our elections from misinformation, disinformation, and threats of violence, we request that you respond to the following questions by January 7, 2022:
Why did Facebook disable controls after the election – including algorithmic controls to help stop the spread of disinformation and controls to limit the growth of groups that spread disinformation about the election results?
Why did Facebook disband its Civic Integrity Team as a standalone unit and disperse its employees to other teams? When was the decision to disband the team made, and who made that decision?
What department or division of Meta is currently responsible for overseeing efforts to prevent the spread of election-related misinformation, disinformation, and violent rhetoric for Meta-owned platforms?
What steps is Meta taking to ensure that Facebook users, like the organizers of the January 6th insurrection, cannot evade the company’s safeguards to continue promoting false claims about elections?
What steps is Meta taking now to protect the integrity of future elections from the spread of misinformation and disinformation, as well as to address violent threats against election officials and workers?
While we acknowledge the efforts Facebook took to prevent the spread of election-related misinformation, disinformation, and hateful rhetoric on the platform prior to the 2020 elections, those efforts clearly were not enough to prevent lies about the election from taking root and fueling violence against our democracy. We look forward to hearing more about how Meta will do better for its users and for our democracy.
Written by Mark Truppner.