You Don’t Want Facebook Deciding What’s True

Last week, to prove a point, Elizabeth Warren ran an ad on Facebook that lies about Mark Zuckerberg: the ad opens by claiming that Zuckerberg has endorsed Donald Trump for re-election, then concedes that the claim is false.

Ostensibly, this is an argument for Facebook to stand up for truth. In practice, it’s not entirely clear what Warren’s demanding or whether she truly wants Facebook to try to deliver it.

Warren says Facebook has “incredible power to affect our elections and our national debate,” and calls on the company to use it more. That might make sense if she thought Facebook were a good and responsible company that would use its power wisely, but she doesn’t. It might make sense if Facebook didn’t have a vested interest in the outcome of elections, but when Facebook itself is a campaign issue, they do. It might make sense if this were a straightforward task that an organization of good faith could do to the satisfaction of reasonable people, but it isn’t.

Warren’s arguments against Facebook extend far beyond this policy. When she laid out her plans to break up big tech — including Facebook — misinformation was only one of her complaints, and even then she focused on Russian interference rather than campaign lies. Warren argues that Facebook is allowing false information in ads because of pressure from the Trump administration and its pursuit of profit. Given these incentives, why should we expect the country to benefit if the company took a more active role?

Facebook has made it clear that they cannot afford to alienate conservatives, and any policy they implement is going to factor that in. If they took a more active role, they’d be refereeing between two sides, one of which might nominate a candidate who believes they should not exist in their current form. Here’s how Zuckerberg put it at a private Facebook meeting:

“If [Warren] gets elected president, then I would bet that we will have a legal challenge, and I would bet that we will win the legal challenge. And does that still suck for us? Yeah. I mean, I don’t want to have a major lawsuit against our own government. … But look, at the end of the day, if someone’s going to try to threaten something that existential, you go to the mat and fight.”

Warren thinks Facebook’s fact-neutrality effectively puts the company’s thumb on the scale. And she thinks they’re doing it under pressure from Republicans, though that is almost entirely speculative — exactly the sort of comment that might get blocked by a platform that policed misinformation. But if it’s true, that’s even more reason to be suspicious of what they would do with active moderation rather than the alleged strategic absence of it.

Warren argues that Facebook is an untrustworthy company with too much power and she argues Facebook should use that power more. That’s not a contradiction, but neither is it very wise. Becoming arbiters of political truth would require flexing the power that Warren believes they should not have. Any moderation mistakes Facebook makes in the run-up to an election could have serious consequences.

Perhaps Facebook could rely on others instead of arbitrating truth themselves. But who? Facebook has to pick the fact checkers, and liberals have already complained about Facebook relying on conservative fact-checkers for posts other than political ads. And no matter who they use, getting it right is extremely difficult.

This is not a question of truth being subjective or one man’s fact being another man’s opinion. Facts exist, and things that run contrary to them are falsehoods. Sometimes it’s unclear, but even in those cases there are facts underlying the questions and statements. Language, on the other hand, is far more ambiguous.

Language has at least three parts: a literal interpretation of the words, what the speaker means by them, and what the listener hears from them. The first part can be cut-and-dried, but can also get bogged down in semantics and pedantry because political speech is often more poetry than precision. The latter two can be hard to pin down even when the fact-checkers are acting in good faith.

Consider Elizabeth Warren’s “false” advertisement. Even if we ignore the tacit admission in the third paragraph that the first paragraph is not true, it’s not clear that the original claim is substantively false. The word “endorse” literally means a public declaration of support, but support through action — such as changing advertising policy on their behalf — could be reasonably called endorsement.

Warren’s ad outlines what Zuckerberg is doing to help Trump, which means it would pass through most misinformation filters. Any that blocked it would likely (and rightly) face accusations of being overly literal. Elizabeth Warren’s sincere comments about the police “murder” of Michael Brown — literally inaccurate, since the officer was not convicted of murder — would be allowed under the same standard.

This is a difficult thing even for organizations of good faith to get right. And it can’t be sidestepped simply by suggesting that only the most egregious falsehoods should be excluded. PolitiFact annually names one claim (or series of related claims) as its Political Lie of the Year. Even singling out what it believes is the worst instance leaves a lot of questionable cases and few people satisfied.

Many of PolitiFact’s lies of the year involve Obamacare. But conservatives can credibly argue that “a government takeover of healthcare” (2010’s winner) means different things depending on context, while liberals can credibly argue that Republicans did in fact vote “to end Medicare” (2011’s winner), by which they mean Republicans voted to get rid of Medicare as it exists today and replace it with a fundamentally different program under the same name. Even if there are no disagreements of fact, there are still disagreements of meaning and terminology.

PolitiFact has neither the financial nor political motivations of Facebook. No presidential candidate called for breaking up PolitiFact, and PolitiFact doesn’t fear big losses if they alienate conservatives. If PolitiFact cannot do a satisfactory job, on what basis do we believe Facebook could do better?

Warren has much more faith than I do in Facebook’s ability to do the right thing if they assert editorial influence. But even if they do the right thing now — whatever that is — that doesn’t mean they always will, which is why giving them even more power over information is risky.

Facebook’s reluctance to arbitrate campaign speech is better for everyone involved, especially its critics.
