Facebook doesn’t need to ban fake news to fight it

The Guardian:

If you walk into a newsagent, and pick up a copy of the Sunday Sport (American readers, think the National Enquirer but with a lower proportion of true stories), you have a number of contextual clues that suggest a story with the headline “Ed Miliband’s Dad Killed My Kitten” might not be entirely true. The prominent soft porn and chatline adverts; the placement alongside other stories like “Bus found buried at south pole” and “World War 2 Bomber Found on Moon”; and the fact that the paper is in its 30th year of publishing, letting readers build up a consistent view about the title based on previous experience.

If a friend shares that same article on Facebook, something very different happens. The story is ripped from its context, and presented as a standard Facebook post. At the top, most prominently, is the name and photo of the person you know in real life who is sharing the piece. That gives the article the tacit support and backing of someone you really know, which makes it far more likely to slip past your bullshit detector.

Next, Facebook pulls the top image, headline, and normally an introductory paragraph, and formats it in its own style: the calming blue text, the standard system font, and the picture cropped down to a standard aspect ratio. Sometimes, that content will be enough for a canny reader to realise something is up: poor spelling, bad photoshopping, or plain nonsensical stories can’t be massaged away by Facebook’s design sense.

Nonetheless, the fact that every link on Facebook is presented in the same way serves to average out the credibility of all the posts on the site. The Sunday Sport’s credibility gets a boost, while the Guardian’s gets a drop: after all, everyone knows you can’t trust everything you read on Facebook.

Then, at the very bottom of the shared story, in small grey text, is the actual source. It’s not prominent, and because it’s simply the main section of a URL, it’s very easy for hoaxes to slip past. Are you sure you could spot the difference between ABC.GO.COM, the American broadcaster’s website, and ABC.CO.COM, a domain that was briefly used to spread a hoax story about Obama overturning the results of the election?

Then below all of that, are three further buttons: like, share and comment. All three help spread the story, whether you support it or not, because Facebook’s algorithm views engagement with a post as a reason for showing it to more people. And while all three get a button to themselves, nowhere does Facebook provide a similar call to action for the most important response of all: clicking through, and reading the whole story in its original context.

For that, you’ll have to scroll back up – but by then, you’ve already moved on to the next article on your newsfeed. And even if you reacted with scepticism when you first read the headline, as time goes by, your initial reaction gets lost, and eventually it becomes one of those things you “just know”.

It’s not an accident that Facebook is designed this way. The company extensively tests its site, to ensure its layout is fully optimised for pursuing its goals.

Unfortunately, Facebook doesn’t A/B test its site for public goods like “functioning media ecosystem” or “not supporting extremist politicians”. Instead, the company’s goals are to maximise time spent on site, to try and make sure readers come back every day and continue to share posts, engage with content, and, ultimately, click on the adverts that have made the social network the fifth largest company in the world by market cap.

So, here’s what Facebook could do to help deal not with fake news, but with the negative effects it has on our society: de-emphasise who shared a story into your timeline, instead branding it with the logo and name of the publication itself, and encourage readers to, well, read, before or instead of liking, sharing and commenting.

Doing so might not be great for Facebook’s bottom line, of course. The site would be less “sticky”, users would be more likely to click away and not come back, and the amount of sharing would drop. But maybe it’s time for Zuckerberg to take one for the team.

As previously noted, never attribute to incompetence that which can be explained by differing incentive structures.