Facebook won’t take down Trump ad with false claims about Joe Biden
President Donald Trump’s reelection campaign is running a false ad about former Vice President Joe Biden on Facebook, and the company isn’t going to do anything about it. That doesn’t bode well for the 2020 election, or for social media platforms’ willingness to act on the spread of misinformation.
The Trump campaign has released a 30-second video ad accusing the former vice president of promising Ukraine money for firing a prosecutor investigating a company with ties to Biden’s son, Hunter Biden — essentially, the false conspiracy theory at the center of the impeachment inquiry President Trump is now facing. CNN refused to air the ad because there is no evidence for the claims it makes. Facebook did not, nor did multiple other tech platforms and media outlets, including YouTube, Twitter, MSNBC, and Fox. Instead, the Menlo Park, California-based company is going to let the ad stay up — and rack up millions of views in the process.
The Biden campaign asked Facebook to take the ad down, but the platform said it’s a no-go. The New York Times first obtained a letter from Facebook to Biden’s camp on Tuesday responding to its request to remove the video, explaining that despite its false claims, it doesn’t actually violate any of Facebook’s policies.
“Our approach is grounded in Facebook’s fundamental belief in free expression, respect for the democratic process, and the belief that, in mature democracies with a free press, political speech is already arguably the most scrutinized speech there is,” Facebook’s head of global elections policy, Katie Harbath, wrote in the letter.
Biden’s camp slammed the decision. In a statement, spokesperson TJ Ducklo called Facebook’s decision “unacceptable” and said that allowing the video to spread “poisons the public discourse and chips away at” democracy. “Whether it originates from the Kremlin or Trump Tower, these lies and conspiracy theories threaten to undermine the integrity of our elections in America,” he said.
Sen. Elizabeth Warren, who is running against Biden for the 2020 Democratic nomination, earlier this week criticized Facebook over its policies related to the veracity of claims made in political ads.
The episode portends an ominous future for political advertising heading into the 2020 election, and for what politicians will and won’t be allowed to claim about their opponents online. The Senate Intelligence Committee just released the second volume of its investigation into influence operations targeting the 2016 election and warned that there is more to come. Social media companies seem hell-bent on taking a hands-off approach to policing political advertising, even when it includes outright lies.
A spokesperson for Facebook explained in an email that under the company’s policies, politicians are exempt from its fact-checking program. That practice is not unique to Facebook. As Recode’s Teddy Schleifer recently outlined, social media companies’ policies, broadly, are that politicians can pretty much say whatever they want online, because it counts as news:
Although platforms say they will enforce their rules against politicians if they must, they will continue to be far more permissive places for candidates than they are for regular posters — who they also continue to struggle to effectively moderate.
Facebook in its letter to the Biden campaign, which Vox also obtained, outlined many of its policies around political advertising and content moderation. It highlighted its “standard for transparency” around Facebook pages and political ads so that people can see who is running them (though, of course, that doesn’t really solve the false information problem). And the company noted that during the 2018 midterms, it rejected one of Trump’s ads for violating its “sensational content” policies.
But Facebook is also making clear, time and time again, that it doesn’t have any intention of making sure the content on its platform is true. In July 2018, Facebook CEO Mark Zuckerberg raised eyebrows when he said in an interview with Recode’s Kara Swisher that he didn’t believe the platform should take down content that denies the Holocaust happened, even though he finds it “deeply offensive.” And in May, the company refused to take down a video that was doctored to make House Speaker Nancy Pelosi appear drunk. After coming under fire for the decision, Facebook demoted the video and made it harder to find, but it didn’t take it down entirely.
Harbath, in her letter to the Biden campaign, explained some of the company’s reasoning:
If a politician seeks to share a viral hoax — like a link to an article or a video or photo, that has been previously debunked, we will demote that content, display related information from fact-checkers, and reject its inclusion in advertisements. That is different from a politician’s own claim or statement — even if the substance of that claim has been debunked elsewhere. If the claim is made directly by a politician on their Page, in an ad or on their website, it is considered direct speech and ineligible for our third party fact checking program.
Facebook’s response to its role in spreading misinformation and influencing politics has consistently disappointed, and this latest episode with the Biden campaign signals that the issue is likely to persist.