Human speech may have a universal transmission rate: 39 bits per second

Science:

Italians are some of the fastest speakers on the planet, chattering at up to nine syllables per second. Many Germans, on the other hand, are slow enunciators, delivering five to six syllables in the same amount of time. Yet in any given minute, Italians and Germans convey roughly the same amount of information, according to a new study. Indeed, no matter how fast or slowly languages are spoken, they tend to transmit information at about the same rate: 39 bits per second, about twice the speed of Morse code.

“This is pretty solid stuff,” says Bart de Boer, an evolutionary linguist who studies speech production at the Free University of Brussels, but was not involved in the work. Language lovers have long suspected that information-heavy languages—those that pack more information about tense, gender, and speaker into smaller units, for example—move slowly to make up for their density of information, he says, whereas information-light languages such as Italian can gallop along at a much faster pace. But until now, no one had the data to prove it.

Scientists started with written texts from 17 languages, including English, Italian, Japanese, and Vietnamese. They calculated the information density of each language in bits—the same unit that describes how quickly your cellphone, laptop, or computer modem transmits information. They found that Japanese, which has only 643 syllables, had an information density of about 5 bits per syllable, whereas English, with its 6949 syllables, had a density of just over 7 bits per syllable. Vietnamese, with its complex system of six tones (each of which can further differentiate a syllable), topped the charts at 8 bits per syllable.
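
That per-syllable figure is essentially a Shannon-entropy estimate over each language's syllable inventory. Here is a minimal sketch of the idea in Python: plain unigram entropy over a made-up toy corpus, not the study's actual estimator (which also conditions on context):

```python
import math
from collections import Counter

def bits_per_syllable(syllables):
    """Unigram Shannon entropy of a syllable sequence, in bits per syllable."""
    counts = Counter(syllables)
    total = sum(counts.values())
    return -sum(n / total * math.log2(n / total) for n in counts.values())

# Made-up toy corpus: the fewer and more evenly used a language's
# syllables are, the fewer bits each syllable carries.
toy_text = ["ka", "to", "ka", "mi", "to", "ka", "na", "mi"]
print(f"{bits_per_syllable(toy_text):.2f} bits/syllable")
```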

Next, the researchers spent 3 years recruiting and recording 10 speakers—five men and five women—from 14 of their 17 languages. (They used previous recordings for the other three languages.) Each participant read aloud 15 identical passages that had been translated into their mother tongue. After noting how long the speakers took to get through their readings, the researchers calculated an average speech rate per language, measured in syllables/second.

Some languages were clearly faster than others: no surprise there. But when the researchers took their final step—multiplying each language’s speech rate by its information density to find out how much information moved per second—they were shocked by the consistency of their results. No matter how fast or slow, how simple or complex, each language gravitated toward an average rate of 39.15 bits per second, they report today in Science Advances. In comparison, the world’s first computer modem (which came out in 1959) had a transfer rate of 110 bits per second, and the average home internet connection today has a transfer rate of 100 megabits per second (or 100 million bits).
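
The final step really is plain arithmetic: information rate equals density times speed. A quick sketch using the densities quoted above (the syllable rates below are illustrative placeholders, not the paper's measured averages) shows how a slow, dense language and a fast, light one can land in the same place:

```python
# Information rate = information density (bits/syllable)
#                  x speech rate (syllables/second).
# Densities are those quoted above; the speech rates are illustrative
# placeholders, NOT the study's measured values.
languages = {
    "Japanese":   (5.0, 7.8),   # light syllables, fast speech
    "English":    (7.1, 5.5),
    "Vietnamese": (8.0, 4.9),   # dense syllables, slower speech
}

for name, (density, rate) in languages.items():
    print(f"{name:<11} {density * rate:5.1f} bits/second")
```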

How social networks can be used to bias votes

Via Charles Arthur, Nature editorial board:

Politicians’ efforts to gerrymander — redraw electoral-constituency boundaries to favour one party — often hit the news. But, as a paper published in Nature this week shows, gerrymandering comes in other forms, too.

The work reveals how connections in a social network can also be gerrymandered — or manipulated — in such a way that a small number of strategically placed bots can influence a larger majority to change its mind, especially if the larger group is undecided about its voting intentions (A. J. Stewart et al. Nature 573, 117–121; 2019).

The researchers, led by mathematical biologist Alexander Stewart of the University of Houston, Texas, have joined those who are showing how it can be possible to give one party a disproportionate influence in a vote.

It is a finding that should concern us all.

A masterful understatement.
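
The mechanism is simple enough to caricature in code. Below is a toy zealot voter model, not Stewart and colleagues' actual game: voters copy their local majority, while a couple of bot accounts wired into every feed never budge, and an initially even electorate drifts to the bots' side.

```python
import random

random.seed(1)
N, ROUNDS = 100, 3000

# Voters start in a dead-even split; two zealot "bot" accounts sit in
# every voter's feed and are permanently fixed at +1.
votes = [1 if i % 2 else -1 for i in range(N)]
feeds = [random.sample([j for j in range(N) if j != i], 8) for i in range(N)]

for _ in range(ROUNDS):
    i = random.randrange(N)
    tally = sum(votes[j] for j in feeds[i]) + 2  # the two bots always vote +1
    if tally:
        votes[i] = 1 if tally > 0 else -1  # copy the local majority

print("share siding with the bots:", votes.count(1) / N)
```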

Auditing Radicalization Pathways on YouTube

Manoel Horta Ribeiro, Raphael Ottoni, Robert West, Virgílio A. F. Almeida, and Wagner Meira, on arXiv:

Non-profits and the media claim there is a radicalization pipeline on YouTube. Its content creators would sponsor fringe ideas, and its recommender system would steer users towards edgier content. Yet, the supporting evidence for this claim is mostly anecdotal, and there are no proper measurements of the influence of YouTube’s recommender system. In this work, we conduct a large-scale audit of user radicalization on YouTube. We analyze 331,849 videos of 360 channels which we broadly classify into: control, the Alt-lite, the Intellectual Dark Web (I.D.W.), and the Alt-right. Channels in the I.D.W. and the Alt-lite would be gateways to fringe far-right ideology, here represented by Alt-right channels. Processing more than 72M comments, we show that the three communities increasingly share the same user base; that users consistently migrate from milder to more extreme content; and that a large percentage of users who consume Alt-right content now consumed Alt-lite and I.D.W. content in the past. We also probe YouTube’s recommendation algorithm, looking at more than 2M recommendations for videos and channels between May and July 2019. We find that Alt-lite content is easily reachable from I.D.W. channels via recommendations and that Alt-right channels may be reached from both I.D.W. and Alt-lite channels. Overall, we paint a comprehensive picture of user radicalization on YouTube and provide methods to transparently audit the platform and its recommender system.

Google knows this but does nothing.
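
The migration claim itself boils down to set arithmetic over commenter histories. A minimal sketch of that measurement, with made-up user IDs standing in for the paper's 72M real comments:

```python
# Toy version of the migration measurement: what fraction of users now
# commenting on Alt-right channels had earlier commented on the milder
# communities? All user IDs here are made up.
earlier = {
    "Alt-lite": {"u1", "u2", "u3", "u4"},
    "I.D.W.":   {"u3", "u4", "u5"},
}
now_alt_right = {"u2", "u4", "u5", "u9"}

gateway = earlier["Alt-lite"] | earlier["I.D.W."]
share = len(now_alt_right & gateway) / len(now_alt_right)
print(f"{share:.0%} of current Alt-right commenters were active on milder channels first")
```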

The Myth of Consumer-Grade Security

Schneier on Security:

The Department of Justice wants access to encrypted consumer devices but promises not to infiltrate business products or affect critical infrastructure. Yet that’s not possible, because there is no longer any difference between those categories of devices. Consumer devices are critical infrastructure. They affect national security. And it would be foolish to weaken them, even at the request of law enforcement.

In his keynote address at the International Conference on Cybersecurity, Attorney General William Barr argued that companies should weaken encryption systems to gain access to consumer devices for criminal investigations. Barr repeated a common fallacy about a difference between military-grade encryption and consumer encryption: “After all, we are not talking about protecting the nation’s nuclear launch codes. Nor are we necessarily talking about the customized encryption used by large business enterprises to protect their operations. We are talking about consumer products and services such as messaging, smart phones, e-mail, and voice and data applications.”

The thing is, that distinction between military and consumer products largely doesn’t exist. All of those “consumer products” Barr wants access to are used by government officials — heads of state, legislators, judges, military commanders and everyone else — worldwide. They’re used by election officials, police at all levels, nuclear power plant operators, CEOs and human rights activists. They’re critical to national security as well as personal security.

This wasn’t true during much of the Cold War. Before the Internet revolution, military-grade electronics were different from consumer-grade. Military contracts drove innovation in many areas, and those sectors got the cool new stuff first. That started to change in the 1980s, when consumer electronics started to become the place where innovation happened. The military responded by creating a category of military hardware called COTS: commercial off-the-shelf technology. More consumer products became approved for military applications. Today, pretty much everything that doesn’t have to be hardened for battle is COTS and is the exact same product purchased by consumers. And a lot of battle-hardened technologies are the same computer hardware and software products as the commercial items, but in sturdier packaging.

My emphasis.

YouTube’s Susan Wojcicki: ‘Where’s the line of free speech – are you removing voices that should be heard?’

The Guardian:

The day before we meet, the tech site Gizmodo publishes a piece on how extremist channels remain on YouTube, despite the new policies. In the face of fairly constant criticism, does Wojcicki ever feel like walking away? “No, I don’t. Because I feel a commitment to solving these challenges,” she says. “I care about the legacy that we leave and about how history will view this point in time. Here’s this new technology, we’ve enabled all these new voices. What did we do? Did we decide to shut it down and say only a small set of people will have their voice? Who will decide that, and how will it be decided? Or do we find a way to enable all these different voices and perspectives, but find a way to manage the abuse of it? I’m focused on making sure we can manage the challenges of having an open platform in a responsible way.”

My emphasis. Her job depends upon her denying that there is a difference between merely uploading a video (which gets lost among millions of others) and deliberately recommending it to others. The YouTube recommendation algorithm is simply toxic. And, like polluters everywhere, they do it because it makes them money.

Superhuman is Spying on You

Mike Davidson:

When you start to think about all of the ways Superhuman can be used to violate privacy, you really wonder why The New York Times spent 1,200 words on a tongue-bath that doesn’t even talk meaningfully about privacy issues at all. We don’t need journalism to tell us where venture capitalists are putting other people’s money. We need it to examine the ramifications of the technology we are pushing into the world and in what ways it might shift the Overton Window for Ethics in either helpful or hurtful ways.

Pithy.

Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites

Arunesh Mathur et al., Princeton:

Dark patterns are user interface design choices that benefit an online service by coercing, steering, or deceiving users into making unintended and potentially harmful decisions. We present automated techniques that enable experts to identify dark patterns on a large set of websites. Using these techniques, we study shopping websites, which often use dark patterns to influence users into making more purchases or disclosing more information than they would otherwise. Analyzing ∼53K product pages from ∼11K shopping websites, we discover 1,841 dark pattern instances, together representing 15 types and 7 categories. We examine the underlying influence of these dark patterns, documenting their potential harm on user decision-making. We also examine these dark patterns for deceptive practices, and find 183 websites that engage in such practices. Finally, we uncover 22 third-party entities that offer dark patterns as a turnkey solution. Based on our findings, we make recommendations for stakeholders including researchers and regulators to study, mitigate, and minimize the use of these patterns.

So depressing.
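
For a flavour of what “automated techniques” can mean here: the authors' real pipeline crawls pages and clusters on-page text segments, but even a crude regex spot-check (sketched below, with made-up page text) catches the two best-known pattern families.

```python
import re

# Two classic dark-pattern families: fake scarcity and fake urgency.
# A regex spot-check is far cruder than the paper's crawl-and-cluster
# pipeline, but it shows the shape of automated detection.
PATTERNS = {
    "low-stock message (scarcity)": re.compile(r"only \d+ left", re.I),
    "countdown timer (urgency)":    re.compile(r"(deal|offer|sale) ends in", re.I),
}

page_text = "Hurry! Only 3 left in stock. Deal ends in 01:59:30."

for label, rx in PATTERNS.items():
    if rx.search(page_text):
        print("flagged:", label)
```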

Fake news ‘vaccine’ works: ‘pre-bunk’ game reduces susceptibility to disinformation

Cambridge University:

An online game in which people play the role of propaganda producers to help them identify real-world disinformation has been shown to increase “psychological resistance” to fake news, according to a study of 15,000 participants.

In February 2018, University of Cambridge researchers helped launch the browser game Bad News. Thousands of people spent fifteen minutes completing it, with many allowing the data to be used for a study.

Players stoke anger and fear by manipulating news and social media within the simulation: deploying Twitter bots, photo-shopping evidence, and inciting conspiracy theories to attract followers – all while maintaining a “credibility score” for persuasiveness.

“Research suggests that fake news spreads faster and deeper than the truth, so combatting disinformation after-the-fact can be like fighting a losing battle,” said Dr Sander van der Linden, Director of the Cambridge Social Decision-Making Lab.

In court, Facebook blames users for destroying their own right to privacy

The Intercept:

Judge Chhabria was at times skeptical of Snyder’s privacy-nonexistence argument, which he rejected for treating personal privacy as binary: “like either you have a full expectation of privacy, or you have no expectation of privacy at all,” as the judge put it at one point. Chhabria continued with a relatable hypothetical:

If I share [information] with ten people, that doesn’t eliminate my expectation of privacy. It might diminish it, but it doesn’t eliminate it. And if I share something with ten people on the understanding that the entity that is helping me share it will not further disseminate it to a thousand companies, I don’t understand why I don’t have — why that’s not a violation of my expectation of privacy.

Snyder responded with an incredible metaphor for how Facebook sees your use of its services — legally, at least:

Let me give you a hypothetical of my own. I go into a classroom and invite a hundred friends. This courtroom. I invite a hundred friends, I rent out the courtroom, and I have a party. And I disclose — And I disclose something private about myself to a hundred people, friends and colleagues. Those friends then rent out a 100,000-person arena, and they rebroadcast those to 100,000 people. I have no cause of action because by going to a hundred people and saying my private truths, I have negated any reasonable expectation of privacy, because the case law is clear.

And there it is, in broad daylight: Using Facebook is a depressing party taking place in a courtroom, for some reason, that’s being simultaneously broadcast to a 100,000-person arena on a sort of time delay. If you show up at the party, don’t be mad when your photo winds up on the Jumbotron. That is literally the company’s legal position.

Don’t pretend you weren’t warned.