
The problems with Facebook’s attempts to manage election information

Facebook co-founder and CEO Mark Zuckerberg arrives for a Capitol Hill hearing in October 2019 (Win McNamee / Getty Images)

This week’s bombshell report in The Washington Post paints a stark picture of how haphazardly Facebook chooses to distribute election information. CEO Mark Zuckerberg reportedly expressed hesitation over distributing voting information in Spanish on WhatsApp, which Facebook owns, saying it would appear partisan to do so.

I mean, come on. Most states are required to distribute voting information in Spanish, making this less a partisan issue and more a basic need for a country whose voters speak many languages. The Post report explains that the idea, which originated with a team of WhatsApp employees, was to translate content from Facebook’s “voting information center” into Spanish and push out the information on WhatsApp “proactively through a chat bot or embedded link to millions of marginalized voters.”

In an unprompted tweet to me, a member of Facebook’s communications staff said the article was incorrect, adding that “WhatsApp DID launch a campaign in Spanish and English, with partners at @votedotorg and @factchecknet, to encourage voter registration and raise awareness about misinformation.” The Post acknowledged that initiative and described it as a “whittled-down version” of the original idea. The spokesman did not respond to my question about whether Zuckerberg did indeed express such a reservation, or whether that reservation affected the speed or breadth of the information given to Spanish-language users of the app.

The point is not that Facebook eventually launched a campaign with some information in Spanish. What really matters is that the CEO of a major American company that plays a direct role in distributing information about voting believes that offering basic voting information in more than one language is inherently partisan. The partisan gamesmanship that Republicans have played for years with Spanish-language voters has sunk in, even with the people responsible for making gigantic choices about access to information. Zuckerberg has been pulled into the fray, potentially without even realizing the inherent racism of such a mindset.

Even though Spanish-speaking voters are not politically homogeneous, they have long been a target of the far-right’s campaign to convince Americans that voter fraud is pervasive. As one Texas representative told me years ago, “If you persuade people that you are the party trying to make sure elections are controlled by American citizens and that Democrats are doing everything they can to make sure that illegal immigrants can vote by the busload, that’s a good position to be in.” 

What is key here is that Facebook volunteered to become a place for voting information; no one really asked it to, and lots of election administrators told it not to, given how fraught with misunderstandings such displays can be. Still, Facebook repeatedly presented its accurate voting information as a counterweight to the massive amounts of disinformation that objectively made our election crisis worse in this country, as good PR amid a PR disaster. And even that accurate information may have been compromised because the company’s CEO was more concerned about how information in Spanish would be received by English-language speakers than about the ways it would help Spanish-speaking voters.

It is worth noting that WhatsApp became a major conduit for election disinformation, specifically targeted at Spanish-speaking users of the platform. Spanish-language WhatsApp groups were flooded in 2020 with false information about fraud and Biden’s victory, circulating in private messages that made the information hard to track and combat.

It is not remotely plausible for Zuckerberg to claim he’d not considered the impact of accurate voting information on these users, who represent a significant chunk of the WhatsApp user base; almost half of Hispanics in the United States use the messaging app, more than any other ethnic group. Nor is it remotely plausible for Facebook to claim that the Washington Post simply made this up. The reporters relied on internal company documents for their reporting, and neither of the two spokespeople I asked for comment has disputed that Zuckerberg said this.

Facebook is under a deluge of journalistic criticism right now, courtesy of thousands of leaked documents provided to 17 news outlets by former employee Frances Haugen. In its responses, the company has said it does not put profits over people. “Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie,” a company spokesperson said in a statement.

It’s not clear whether Facebook delayed the Spanish-language information in a meaningful way, or whether a larger campaign was scaled back because of Zuckerberg’s hesitation about how the language offerings would be perceived. So we don’t know, and probably won’t barring discovery or additional leaked information, whether Facebook’s assertion that it doesn’t put profit over people’s well-being is true. The available evidence just doesn’t back that up.

Let’s take stock of Facebook’s various election fumblings, acknowledging that not even this list paints a full picture of the platform’s problems in this space:

  • Facebook’s own oversight board has long derided the company for failing to have a firm policy for dealing with non-violent speech that tends to lead to actual violence, as we’ve seen firsthand over the last four years. Politico has a piece on how this failure to act, along with leadership delays and general company bumbling, slowed down the response to the avalanche of misinformation ahead of the Jan. 6 insurrection.
  • Facebook has largely refused to answer specific questions about its policies for preventing voting misinformation, and has refused to share crucial data that would help make sense of the spread of this information with researchers and even the White House. The company even locked researchers who study misinformation out of the platform last year. 
  • It was clear even before the 2020 election that Facebook’s plan to have users flag misinformation was going to be ineffective, and the company carried on with it anyway. Troll farms ultimately reached 140 million Americans ahead of the election. This ineffective system worked even less well when it came to the chief spreader of fake news: Donald Trump. Despite the increasingly fact-free nature of Trump’s claims about election integrity, multiple news organizations and researchers found that Facebook treated his posts with kid gloves, even when they contained information that violated the site’s user agreement.
  • Far-right election misinformation continues to thrive on the platform, despite all of Facebook’s breathless reassurance that it is addressing the problem. Misinformation gets 600 percent more engagement than real information, researchers say. The breadth of this issue apparently dawned on the company only recently: it announced this month that it was “considering” putting together an advisory panel of experts on election misinformation. Good timing, guys!
  • The Washington Post reports that Facebook was aware that politicians could and likely would use paid advertising on the site to spread misinformation, and decided to accept the risk. Even internally, the company said the risk that ads would be used in this way was “high.” The company continues to refuse to fact-check the postings of politicians as a rule.
  • Facebook acted so late to combat misinformation in 2020 that one nonprofit group found the company failed to prevent more than 10 billion page views of false information that year. That’s stunning, given that Facebook has literally known for years that its platform is extremely influential in driving both voter turnout and voter suppression.
  • The company put together a team that significantly improved the site’s response to foreign misinformation and other concerns carried over from 2016, but then disbanded the group the very day after the 2020 election. COO Sheryl Sandberg, without citing any specific data to support the conclusion, said that the Jan. 6 insurrection was “largely” planned on other social media platforms. Essentially, the company declared victory. The whistleblower documents covered in recent days underscore how premature and false that statement was.

Facebook may prefer to think of itself as nothing more than a platform for expression, but it holds a disproportionate amount of power in shaping its users’ views and politics. It is possible, tilting towards probable, that the company doesn’t understand how much power it truly has, much less how to manage that power responsibly.

One thing that might help is separating anyone who has an interest in the company’s bottom line, primarily Zuckerberg, from making decisions about how its platforms control misinformation and disseminate accurate information. That’s a distinction many journalism organizations strive to make because of their belief in serving the public interest. Maybe it’s time for Facebook to see itself as the powerful media company that it is.
