Facebook news feeds

From Ars Technica:

As the protests in Ferguson, Missouri over police fatally shooting 18-year-old Mike Brown have raged through the past several nights, more than a few people have noticed how relatively quiet Facebook news feeds have been on the matter. While #Ferguson is a trending hashtag, Zeynep Tufekci pointed out at Medium that news about the violence was, at best, slow to percolate through her own feed, despite people posting liberally about it.

While I’ve been seeing the same political trending tags, my feed is as mundane as usual: a couple is expecting a baby. A recreational softball team won a league championship. A few broader feel-good posts about actor Chris Pratt’s ice-bucket challenge to raise awareness and money for ALS, another friend’s ice-bucket challenge, another friend’s ice-bucket challenge… in fact, way more about ice bucket challenges than Ferguson or any other news-making event. In my news feed organized by top stories over the last day, I get one post about Ferguson. If I set it to organize by “most recent,” there are five posts in the last five hours.

Zach Seward of Quartz noted, also anecdotally, that Facebook seems more likely to show videos of people dumping cold water on their heads in high summer than police officers shooting tear gas at protesters and members of the media. And rightfully so in Facebook’s warped version of reality: people on Facebook may not be so interested in seeing the latter. At least, not if Facebook can’t show them the right angle. But Facebook’s algorithmic approach and the involvement of content sources are starting to come together such that it may soon be able to do exactly that.

Facebook’s controversial news feed manipulation study revealed, on a very small scale, that showing users more positive content encourages them to create positive content, resulting in a happier, more reassuring Facebook experience. Showing them negative content leads them to create more negative content, a self-reinforcing loop of negativity.

A second, earlier study from independent researchers, published in January, looked at how political content and debate affect users’ perceptions of Facebook, their friends, and their use of the site. The study found that, because Facebook friend networks are often composed of “weak ties” where the threshold for friending someone is low, users were often unpleasantly surprised to see acquaintances express political opinions different from their own. This felt alienating and, overall, made everyone less likely to speak up on political matters (and therefore to create content for Facebook).