The term cha (茶) is “Sinitic,” meaning it is common to many varieties of Chinese. It began in China and made its way through central Asia, eventually becoming “chay” (چای) in Persian. That is no doubt due to the trade routes of the Silk Road, along which, according to a recent discovery, tea was traded over 2,000 years ago. This form spread beyond Persia, becoming chay in Urdu, shay in Arabic, and chay in Russian, among others. It even made its way to sub-Saharan Africa, where it became chai in Swahili. The Japanese and Korean terms for tea are also based on the Chinese cha, though those languages likely adopted the word even before its westward spread into Persian.
But that doesn’t account for “tea.” The Chinese character for tea, 茶, is pronounced differently by different varieties of Chinese, though it is written the same in them all. In today’s Mandarin, it is chá. But in the Min Nan variety of Chinese, spoken in the coastal province of Fujian, the character is pronounced te. The key word here is “coastal.”
The te form used in coastal-Chinese languages spread to Europe via the Dutch, who became the primary traders of tea between Europe and Asia in the 17th century, as explained in the World Atlas of Language Structures. The main Dutch ports in east Asia were in Fujian and Taiwan, both places where people used the te pronunciation. The Dutch East India Company’s expansive tea importation into Europe gave us the French thé, the German tee, and the English tea.
There’s a great map to demonstrate it visually.
Amid unceasing criticism of Facebook’s immense power and pernicious impact on society, its CEO, Mark Zuckerberg, announced Thursday that his “personal challenge” for 2018 will be “to focus on fixing these important issues”.
Zuckerberg’s new year’s resolution – a tradition for the executive who in previous years has pledged to learn Mandarin, run 365 miles, and read a book each week – is a remarkable acknowledgment of the terrible year Facebook has had.
“Facebook has a lot of work to do – whether it’s protecting our community from abuse and hate, defending against interference by nation states, or making sure that time spent on Facebook is time well spent,” Zuckerberg wrote on his Facebook page. “We won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools.”
At the beginning of 2017, as many liberals were grappling with Donald Trump’s election and the widening divisions in American society, Zuckerberg embarked on a series of trips to meet regular Americans in all 50 states. But while Zuckerberg was donning hard hats and riding tractors, an increasing number of critics both inside and outside of the tech industry were identifying Facebook as a key driver of many of society’s current ills.
The past year has seen the social media company try – and largely fail – to get a handle on the proliferation of misinformation on its platform; acknowledge that it enabled a Russian influence operation to influence the US presidential election; and concede that its products can damage users’ mental health.
By attempting to take on these complex problems as his annual personal challenge, Zuckerberg is, for the first time, setting himself a task that he is unlikely to achieve. With 2 billion users and a presence in almost every country, the company’s challenges are no longer bugs that can be addressed by engineering code.
Facebook, like other tech giants, has long maintained that it is essentially politically neutral – the company has “community standards” but no clearly articulated political orientation. While in past years, that neutrality has enabled Facebook to grow at great speed without assuming responsibility for how individuals or governments used its tools, the political tumult of recent years has made such a stance increasingly untenable.
The difficulty of Facebook’s task is illustrated in the company’s current conundrum over enforcing US sanctions against some world leaders but not others, leaving observers to wonder what rules, if any, Facebook is actually playing by.
I know he’s come a long way since scoffing at the idea that Facebook had any problems at all, but isn’t this the sort of thing a CEO is paid to do as a, er, regular job, without the melodrama of a “personal challenge”?
Glenn Greenwald on The Intercept:
FACEBOOK NOW SEEMS to be explicitly admitting that it also intends to follow the censorship orders of the U.S. government. Earlier this week, the company deleted the Facebook and Instagram accounts of Ramzan Kadyrov, the repressive, brutal, and authoritarian leader of the Chechen Republic, who had a combined 4 million followers on those accounts. To put it mildly, Kadyrov — who is given free rein to rule the province in exchange for ultimate loyalty to Moscow — is the opposite of a sympathetic figure: He has been credibly accused of a wide range of horrific human rights violations, from the imprisonment and torture of LGBTs to the kidnapping and killing of dissidents.
But none of that dilutes how disturbing and dangerous Facebook’s rationale for its deletion of his accounts is. A Facebook spokesperson told the New York Times that the company deleted these accounts not because Kadyrov is a mass murderer and tyrant, but because “Mr. Kadyrov’s accounts were deactivated because he had just been added to a United States sanctions list and that the company was legally obligated to act.”
As the Times notes, this rationale appears dubious or at least inconsistently applied: Others who are on the same sanctions list, such as Venezuelan President Nicolas Maduro, remain active on both Facebook and Instagram. But just consider the incredibly menacing implications of Facebook’s claims.
What this means is obvious: that the U.S. government — meaning, at the moment, the Trump administration — has the unilateral and unchecked power to force the removal of anyone it wants from Facebook and Instagram by simply including them on a sanctions list. Does anyone think this is a good outcome? Does anyone trust the Trump administration — or any other government — to compel social media platforms to delete and block anyone it wants to be silenced?
Facebook’s major effort to stop the spread of false articles on its platform did not result in less engagement for the most viral hoaxes in 2017, according to an analysis by BuzzFeed News. In fact, the analysis found that the 50 most viral fake news stories of 2017 generated more total shares, reactions, and comments than the top 50 hoaxes of 2016.
This highlights the challenge faced by Facebook to find ways to halt or arrest the spread of completely false stories on its platform, and raises questions about how much progress has been made in fighting this type of misinformation. After a year of working with fact-checkers such as PolitiFact and Snopes, and stating that related product efforts can reduce spread of a false news story by 80% once it’s been debunked, Facebook remains the home of massively viral hoaxes.
In response to the BuzzFeed News analysis, a Facebook spokesperson said the company’s initiatives continue to improve, and that the spread of pure fake news has been reduced on its platform.
Throughout 2017, as social-media-addled minds chased one shiny news object after another, one story was constant and inescapable: “Russia.”
But what “Russia” meant, exactly, depended a lot on where you got your news. The “Russia story” that developed throughout the year was likely very different from the “Russia story” as it was understood by your friends, relatives, or co-workers, not just because your politics might be different than theirs but because you might have been encountering different news entirely. To some Democrats, the “Russia Story” was that Donald Trump was a wholesale asset of the Russian government. To others, the “Russia Story” was the F.B.I.’s methodical investigation. And to many Republicans, the “Russia story” was actually a story about the Democrats’ collusion with Russia, not the Trump campaign’s.
We wanted to see how those competing narratives were shaped, week by week, story by story, through 2017 — examining not just broad “liberal” and “conservative” bubbles, but also the different ways highly partisan readers and their less-partisan neighbors might have encountered the story. Would outlets directed toward zealously partisan Democrats frame the story differently than those with readers more evenly distributed along the political spectrum? What about on the other side of the fence?
To re-create approximate social-media “bubbles,” we used “partisanship scores,” developed by Harvard’s Berkman Klein Center. Rather than attempt to arbitrarily measure the slant of a given publication’s editorial line, these scores gauge the partisanship of its audience, by measuring how frequently its stories were shared by Clinton or Trump supporters during the 2016 election. For our purposes, outlets shared vastly more often by Clinton supporters than Trump supporters are categorized as “Highly Democratic”; outlets where the ratio was closer but still weighted toward Clinton were labeled “Mostly Democratic”; and so on. These reconstructed bubbles aren’t a perfect representation of how people encountered their news, but the partisanship scores allow us to focus on publications’ actual audience — the real citizens of the bubble — rather than our perceptions of their politics.
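The ratio-based scoring described above can be sketched in a few lines. This is purely illustrative: the formula, the score range, and the category thresholds below are my assumptions for demonstration, not the Berkman Klein Center’s actual methodology.

```python
def partisanship_score(clinton_shares, trump_shares):
    """Hypothetical share-weighted lean in [-1, 1]:
    +1 = shared only by Clinton supporters, -1 = only by Trump supporters."""
    total = clinton_shares + trump_shares
    if total == 0:
        return 0.0
    return (clinton_shares - trump_shares) / total

def audience_bucket(score):
    # Threshold values are invented for illustration.
    if score > 0.6:
        return "Highly Democratic"
    if score > 0.2:
        return "Mostly Democratic"
    if score >= -0.2:
        return "Centrist"
    if score >= -0.6:
        return "Mostly Republican"
    return "Highly Republican"
```

An outlet whose stories were shared 90 times by Clinton supporters and 10 times by Trump supporters would land in the “Highly Democratic” bucket; a 50/50 split would be centrist.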
Fascinating comparative demonstration of the real-world effect of different news sources in a digital age.
Under fire for Facebook Inc.’s role as a platform for political propaganda, co-founder Mark Zuckerberg has punched back, saying his mission is above partisanship. “We hope to give all people a voice and create a platform for all ideas,” Zuckerberg wrote in September after President Donald Trump accused Facebook of bias.
Zuckerberg’s social network is a politically agnostic tool for its more than 2 billion users, he has said. But Facebook, it turns out, is no bystander in global politics. What he hasn’t said is that his company actively works with political parties and leaders including those who use the platform to stifle opposition—sometimes with the aid of “troll armies” that spread misinformation and extremist ideologies.
The initiative is run by a little-known Facebook global government and politics team that’s neutral in that it works with nearly anyone seeking or securing power. The unit is led from Washington by Katie Harbath, a former Republican digital strategist who worked on former New York Mayor Rudy Giuliani’s 2008 presidential campaign. Since Facebook hired Harbath three years later, her team has traveled the globe helping political clients use the company’s powerful digital tools.
In some of the world’s biggest democracies—from India and Brazil to Germany and the U.K.—the unit’s employees have become de facto campaign workers. And once a candidate is elected, the company in some instances goes on to train government employees or provide technical assistance for live streams at official state events.
If you spend a lot of money with Facebook, of course they’ll help you spend it effectively. I’m not sure that Facebook has really thought through the implications of what they’re doing, or the scope for their role to go (or do) wrong.
David Pemsel, CEO of Guardian Media Group, interviewed in Digiday UK:
What’s next for publishers’ relationship with Facebook and Google?
We have a close relationship with Google from [CEO] Sundar [Pichai] down. They recognize the role of quality news within their ecosystem. So we’ve collaborated a lot around video, VR funding, data analytics and engineering resources. It’s a valuable strategic relationship.
What about Facebook?
Facebook is a different picture. Our relationship with them is difficult because we’ve not found the strategic meeting point on which to collaborate. Eighteen months ago, they changed their algorithm, which showed their business model was derived on virality, not on the distribution of quality. We argue that quality, for societal reasons, as well as to derive ad revenue, should be part of their ecosystem. It’s not. We came out of Instant Articles because we didn’t want to provide our journalism in return for nothing. When you have algorithms that are fueling fake news and virality with no definition around what’s good or bad, how can the Guardian play a role within that ecosystem? The idea of what the Guardian does being starved of oxygen in those environments is not only damaging to our business model but damaging to everyone.
The YouTube Film School scene just keeps growing, in part because viewers love watching these videos. Want more Fincher knowledge? You can learn about how he made music videos from Patrick Willems, dig into the “invisible details” in his work with kaptainkristian, or get a 14-minute breakdown on lots of his tricks from The Film Guy. These videos offer so many new ways to watch movies, TV, or anything else. You might grow fascinated with how texting is represented on screen, learn why a focus-puller is so important, or discover how color tints can completely change the way you experience a film.
The YouTube Film School instills a new appreciation for why props matter, how foley art works (and what foley art is), and why a one-shot or a dolly zoom can hijack your brain and make you feel something. “You have these film-school people who are like, ‘Let me tell you what you’re feeling intuitively about what Spielberg is trying to do to your brain,’” says Jason Kottke, a prominent blogger and YouTube Film School fan whose posts turned me on to many of these creators. “There’s been film criticism as long as there’s been films,” he says, “but writing about film is a little like dancing about architecture. Video lends itself really well to looking at how these moviemakers put things together.” Whether you want to be a filmmaker yourself, or just want to understand more about why movies matter and what they do to your brain, it’s all right there on YouTube. Like and subscribe.
Given our discussion in the What is Cloud Computing? chapter, you might expect Netflix to serve video using AWS. Press play in a Netflix application and the video, stored in S3, would stream over the internet directly to your device.
A completely sensible approach…for a much smaller service.
But that’s not how Netflix works at all. It’s far more complicated and interesting than you might imagine.
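The naive S3-direct approach described above can be sketched in a few lines. Everything here is a hypothetical illustration: the bucket name, object key, and URL shape are assumptions, and a real service would sign and authenticate these requests rather than hand out public object URLs.

```python
def naive_stream_url(bucket, key, region="us-east-1"):
    """Hypothetical sketch of serving video straight from S3:
    every play pulls the full object over the public internet,
    with no edge caching between the viewer and the bucket."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

# A player would simply fetch this URL and start streaming.
url = naive_stream_url("example-video-bucket", "shows/s01e01.mp4")
```

At small scale this works fine; the rest of the article explains why it breaks down at Netflix’s scale.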
To see why, let’s look at some impressive Netflix statistics for 2017.
- Netflix has more than 110 million subscribers.
- Netflix operates in more than 200 countries.
- Netflix has nearly $3 billion in revenue per quarter.
- Netflix adds more than 5 million new subscribers per quarter.
- Netflix plays more than 1 billion hours of video each week. As a comparison, YouTube streams 1 billion hours of video every day while Facebook streams 110 million hours of video every day.
- Netflix played 250 million hours of video on a single day in 2017.
- Netflix accounts for over 37% of peak internet traffic in the United States.
- Netflix plans to spend $7 billion on new content in 2018.
What have we learned?
Netflix is huge. They’re global, they have a lot of members, they play a lot of videos, and they have a lot of money.
As you’re slumped in front of the next auto-playing episode, you probably aren’t thinking about how Netflix works, but this article explains it.
BERLIN (Reuters) – Germany’s intelligence service has published the details of social network profiles which it says are fronts faked by Chinese intelligence to gather personal information about German officials and politicians.
The BfV domestic intelligence service took the unusual step of naming individual profiles and organizations it says are fake, to warn public officials about the risk of leaking valuable personal information via social media.