As pundits debrief the election they completely failed to predict, and much of the general population reels in dismay, searching for sources of proactive resistance, many in the media are now staring in the mirror, wondering how to fix the growing problem of Facebook. At 1.18 billion users, Facebook is now the de facto most powerful news service in the world—and as we have discussed before, it is chock-full of horseshit. A week ago, the discussion centered on this as a “post-fact” election, a frankly insane notion backed up by a Pew Research Center report that 81 percent of partisans disagree about “basic facts.” A BuzzFeed report from earlier this year investigated the Facebook pages seen by both the left and the right, and found that patently false stories made up 20 percent and 38 percent of their respective feeds. The problem was only exacerbated in August, when Facebook fired its small team of internal editorial staffers, who had been charged with keeping such false stories from trending.
This misinformation is reinforced by Facebook’s programmatic sorting of people by political leaning. Once Facebook determines your political beliefs—which it does whether you ask it to or not—its feed algorithm keeps dissenting information from appearing within your bubble.
In response to all this, The New York Times determined prior to the election that “the cure for fake journalism is an overwhelming dose of good journalism.” At New York magazine, Max Read calls this decree “inspiring” but not pragmatic, given that much of the problem here is sheer volume. People demanded “news” that verified the things they already believed, and so a flood of sites has sprung up to supply it (and make money off the ensuing traffic). It’s that very flood that drowns out any “dose of good journalism.”
It seems to me that the problem we face is not a lack of journalism, good or bad, but an overwhelming abundance of it. Fake news attacks discourse in structurally similar ways to the DDoS attacks that recently crippled internet infrastructure for a day: hoaxes overwhelm political conversation (facts, ideas, stories) with junk, exploiting the fact that the rules of the system (in this case, freedom of speech) prevent it from distinguishing between “legitimate” and “illegitimate” speech, and therefore from stopping the attack. An overwhelming dose of good journalism, rather than addressing or rebutting lies and hoaxes, would simply add to the cacophony; presented identically on Facebook alongside fake journalism, it would merely appear as another opinion in a swarm of them.
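The volume argument can be made concrete with a toy model. (This is purely illustrative; the function name and every number below are assumptions for the sake of the sketch, not measurements of any real feed.) If a platform presents real and fake stories identically, a reader’s chance of encountering a real one is simply its share of total volume—and since fake sites are far cheaper to run than newsrooms, the flood can always out-scale the “dose.”

```python
# Toy model of an undifferentiated feed: stories are served in
# proportion to volume alone, because the system cannot tell
# "legitimate" from "illegitimate" content.
# All numbers are illustrative assumptions, not measurements.

def real_story_share(real: int, fake: int) -> float:
    """Fraction of the feed made up of real stories."""
    return real / (real + fake)

# Baseline: 100 real stories against a flood of 900 fakes.
baseline = real_story_share(100, 900)    # 0.10

# "An overwhelming dose of good journalism": double the real stories.
doubled = real_story_share(200, 900)     # ~0.18

# But fakes are cheap to produce, so the flood doubles too.
flooded = real_story_share(200, 1800)    # back to 0.10

print(f"baseline={baseline:.2f} doubled={doubled:.2f} flooded={flooded:.2f}")
```

Doubling the good journalism barely moves the needle, and any response in kind from the hoax sites erases the gain entirely—which is the sense in which “more good journalism” fails as a cure.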
At Mashable, Damon Beres contextualizes this as not just a problem of Facebook’s intent (What do they have to gain here?), but of its very design. If intentional misinformation is presented to users in the exact same white square as a deeply reported, fact-checked article by a reputable news source, how is the user to tell the difference?
The problem is complicated, but not unsolvable. Beres offers these suggestions:
Facebook should shave the little calloused “Trending” nub, which does nothing for no one, and then get to work reshaping how people share media on the platform.… There are potential “outs” for media to actually work well on Facebook. Maybe customized branding becomes more prominent. Maybe Facebook even offers a way for publishers to charge for content — paid magazines, at least, were made to suggest premium quality. Maybe it re-tools its algorithms to get smarter about legit content, or uses fact-checking software or even (re-)hires human journalists.
Bad information is part of information; noise is part of a signal. The media can get better at qualifying its real news, and the public can get better at deciphering it. A few years ago, On The Media produced a Breaking News Consumer Handbook, a series of nine guidelines for retrieving trustworthy information in an emergency situation. Several of them—“don’t trust anonymous sources,” “compare multiple sources”—could just as easily translate to a new Social Media Consumer Handbook, perhaps this time with the addendum, “Don’t read something your racist aunt posts,” and, “If the website looks like it was designed in Microsoft Word, it probably was.”