A War of Words Puts Facebook at the Center of Myanmar’s Rohingya Crisis

New York Times:

Facebook has no office in Myanmar, but the company has worked with local partners to introduce a Burmese-language illustrated copy of its platform standards and will “continue to refine” its practices, said a spokeswoman, Clare Wareing, in an emailed statement.

Human rights groups say the company’s approach has allowed opinion, facts and misinformation to mingle on Facebook, clouding perceptions of truth and propaganda in a country where mobile technology has been widely adopted only in the past three years.

Under the rule of the military junta, strict censorship regulations deliberately kept SIM cards for cellphones prohibitively expensive in order to control the free flow of information. In 2014, restrictions loosened and the use of mobile technology exploded as SIM cards became affordable. The number of Facebook users ballooned from about two million in 2014 to more than 30 million today [2017]. But most users do not know how to navigate the wider internet.

In the meantime, Facebook has become a breeding ground for hate speech and virulent posts about the Rohingya. And because of Facebook’s design, posts that are shared and liked more frequently get more prominent placement in feeds, favoring highly partisan content in timelines.

My emphasis.

22 November 2017: As the Columbia Journalism Review notes, “In some countries, fake news on Facebook is a matter of life and death”:

Larson says there’s a debate to be had about how to define hate speech, “but what I would consider dangerous speech is advocating that the Rohingya need to leave Myanmar, and sharing doctored images of them supposedly burning their own houses to create a media spectacle.”

In a way, she says, these images—which were liked and shared tens of thousands of times—“gave cover for military action and human rights violations, including violence and rape. You can’t say social media kills people... but certainly social media shaped public opinion in a way that seems to have played a part in the escalation of violence against the Rohingya.”

Facebook’s approach to countries like Myanmar and others in the region often strikes those on the ground as not just out of touch but actively cavalier. In recent experiments, for example, users in countries like Cambodia and Slovakia had news articles moved to a completely separate feed, which local nonprofit groups and media outlets say significantly impacted their ability to reach people with crucial information.

It’s one thing to tread carefully around issues like free speech, Larson says, “but if you’re going to run A/B testing, where you change an algorithm and see what you think consumers like best, for god’s sake, stick to stable democracies. Don’t pick a place where there’s an authoritarian regime that is busy locking up opposition leaders, and Facebook is a primary way that activists communicate about their government.”

In many ways, Myanmar is an example of the future Mark Zuckerberg seems to want: a country in which most people are connected through the social network and get virtually all of their news from it. And yet the outcome of that vision isn’t a utopia; it’s a dystopia—a world where ethnic and cultural tensions are inflamed and weaponized. And Facebook’s response looks inadequate to the dangers it has helped unleash.