Carole Cadwalladr in The Guardian does a useful roundup of her horrifying investigation into influencing the electoral and referendum process.
What’s been lost in the US coverage of this “data analytics” firm is the understanding of where the firm came from: deep within the military-industrial complex. A weird British corner of it populated, as the military establishment in Britain is, by old-school Tories. Geoffrey Pattie, a former parliamentary under-secretary of state for defence procurement and director of Marconi Defence Systems, used to be on the board, and Lord Marland, David Cameron’s pro-Brexit former trade envoy, a shareholder.
Steve Tatham was the head of psychological operations for British forces in Afghanistan. The Observer has seen letters endorsing him from the UK Ministry of Defence, the Foreign Office and Nato.
SCL/Cambridge Analytica was not some startup created by a couple of guys with a Mac PowerBook. It’s effectively part of the British defence establishment. And, now, too, the American defence establishment. An ex-commanding officer of the US Marine Corps operations centre, Chris Naler, has recently joined Iota Global, a partner of the SCL group.
This is not just a story about social psychology and data analytics. It has to be understood in terms of a military contractor using military strategies on a civilian population. Us. David Miller, a professor of sociology at Bath University and an authority on psyops and propaganda, says it is “an extraordinary scandal that this should be anywhere near a democracy. It should be clear to voters where information is coming from, and if it’s not transparent or open where it’s coming from, it raises the question of whether we are actually living in a democracy or not.”
Indeed, one particularly interesting section of Israeli society provides a unique laboratory for how to live a contented life in a post-work world. In Israel, a significant percentage of ultra-orthodox Jewish men never work. They spend their entire lives studying holy scriptures and performing religious rituals. They and their families don’t starve to death partly because the wives often work, and partly because the government provides them with generous subsidies. Though they usually live in poverty, government support means that they never lack for the basic necessities of life.
That’s universal basic income in action. Though they are poor and never work, in survey after survey these ultra-orthodox Jewish men report higher levels of life-satisfaction than any other section of Israeli society. In global surveys of life satisfaction, Israel is almost always at the very top, thanks in part to the contribution of these unemployed deep players.
You don’t need to go all the way to Israel to see the world of post-work. If you have at home a teenage son who likes computer games, you can conduct your own experiment. Provide him with a minimum subsidy of Coke and pizza, and then remove all demands for work and all parental supervision. The likely outcome is that he will remain in his room for days, glued to the screen. He won’t do any homework or housework, will skip school, skip meals and even skip showers and sleep. Yet he is unlikely to suffer from boredom or a sense of purposelessness. At least not in the short term.
Hence virtual realities are likely to be key to providing meaning to the useless class of the post-work world. Maybe these virtual realities will be generated inside computers. Maybe they will be generated outside computers, in the shape of new religions and ideologies. Maybe it will be a combination of the two. The possibilities are endless, and nobody knows for sure what kind of deep plays will engage us in 2050.
In any case, the end of work will not necessarily mean the end of meaning, because meaning is generated by imagining rather than by working. Work is essential for meaning only according to some ideologies and lifestyles. Eighteenth-century English country squires, present-day ultra-orthodox Jews, and children in all cultures and eras have found a lot of interest and meaning in life even without working. People in 2050 will probably be able to play deeper games and to construct more complex virtual worlds than in any previous time in history.
Consider the conditions that allow for tacit collusion. First, the market is concentrated and hard for others to enter. The petrol stations on the Vineyard were cut off from the mainland. Second, prices are transparent in a way that renders any attempt to steal business by lowering prices self-defeating. A price cut posted outside one petrol station will soon be matched by the others. And if one station raises prices, it can always cut them again if the others do not follow. Third, the product is a small-ticket and frequent purchase, such as petrol. Markets for such items are especially prone to tacit collusion, because the potential profits from “cheating” on an unspoken deal, before others can respond, are small.
Now imagine what happens when prices are set by computer software. In principle, the launch of, say, a smartphone app that compares prices at petrol stations ought to be a boon to consumers. It saves them the bother of driving around for the best price. But such an app also makes it easy for retailers to monitor and match each other’s prices. Any one retailer would have little incentive to cut prices, since robo-sellers would respond at once to ensure that any advantage is fleeting. The rapid reaction afforded by algorithmic pricing means sellers can co-ordinate price rises more quickly. Price-bots can test the market, going over many rounds of price changes, without any one supplier being at risk of losing customers. Companies might need only seconds, and not days, to settle on a higher price, note Messrs Ezrachi and Stucke.
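The ratchet dynamic described above is easy to see in a toy simulation. This is purely illustrative (the seller names, step size and ceiling are my assumptions, not from the article): one bot repeatedly probes a small price increase, the rival's bot matches it instantly, so the probe never loses customers and prices climb until they hit the probing bot's ceiling.

```python
# Toy sketch (hypothetical parameters): two sellers whose pricing bots
# instantly match each other. Bot A "probes" upward in small steps;
# because bot B matches before any customer can switch, A never loses
# sales, and both prices ratchet up to A's ceiling.

def simulate(start_price, ceiling, step=0.01, rounds=100):
    """Return the (price_a, price_b) history over the given rounds."""
    a = b = start_price
    history = [(a, b)]
    for _ in range(rounds):
        if a + step <= ceiling:
            a = round(a + step, 2)   # A tests a small increase
        b = a                        # B's bot matches at once
        history.append((a, b))
    return history

history = simulate(start_price=1.00, ceiling=1.50)
print(history[0])    # (1.0, 1.0)
print(history[-1])   # both sellers end at the ceiling: (1.5, 1.5)
```

Note that no agreement is ever struck: the higher price emerges purely from each bot's matching rule, which is exactly why this kind of tacit collusion is hard for regulators to reach.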
THE EFFECT I cut a deck of cards a couple of times, and you glimpse flashes of several different cards. I turn the cards facedown and invite you to choose one, memorize it and return it. Now I ask you to name your card. You say (for example), “The queen of hearts.” I take the deck in my mouth, bite down and groan and wiggle to suggest that your card is going down my throat, through my intestines, into my bloodstream and finally into my right foot. I lift that foot and invite you to pull off my shoe and look inside. You find the queen of hearts. You’re amazed. If you happen to pick up the deck later, you’ll find it’s missing the queen of hearts.
Teller (the smaller, quieter half of Penn & Teller) gives the best explanation of a magic trick I’ve seen.
In the Uber model (and many other gig economy platform models) the platform agencies are the ones who collect the cash first and then distribute it back (less fees) to drivers on a weekly basis.
While many might deem this an innocuous difference, it isn’t really. The platforms gain a significant cash-flow advantage over the drivers as a consequence. The arrangement also entirely flips the power and credit-risk distributions on their head. Under this model it is not the drivers who purchase services from third-party dispatch platforms; by any logical reading, it is the platforms who purchase services from the drivers.
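The size of that cash-flow advantage is easy to estimate. The numbers below are hypothetical (daily fares, fleet size and settlement period are my assumptions, not figures from the article), but the arithmetic shows how daily collection plus weekly payout gives the platform a pool of interest-free working capital:

```python
# Hypothetical figures, for illustration only: a platform that collects
# fares daily but settles with drivers weekly holds each driver's money
# for several days on average. Across a large fleet, that float is a
# sizeable interest-free balance for the platform.

daily_fares = 200.00      # assumed gross fares per driver per day
drivers = 100_000         # assumed fleet size
settlement_days = 7       # weekly payout cycle

# Money earned on day d of the cycle sits with the platform for
# (settlement_days - d) days, so the average holding period is:
avg_hold_days = sum(settlement_days - d
                    for d in range(settlement_days)) / settlement_days

float_per_driver = daily_fares * avg_hold_days   # average balance held
total_float = float_per_driver * drivers

print(avg_hold_days)   # 4.0 days on average
print(total_float)     # 80000000.0 held at any given moment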
Facebook has publicly acknowledged that its platform has been exploited by governments seeking to manipulate public opinion in other countries – including during the presidential elections in the US and France – and pledged to clamp down on such “information operations”.
In a white paper authored by the company’s security team and published on Thursday, the company detailed well-funded and subtle techniques used by nations and other organizations to spread misleading information and falsehoods for geopolitical goals. These efforts go well beyond “fake news”, the company said, and include content seeding, targeted data collection and fake accounts that are used to amplify one particular view, sow distrust in political institutions and spread confusion.
“We have had to expand our security focus from traditional abusive behavior, such as account hacking, malware, spam and financial scams, to include more subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people,” said the company.
In its effort to clamp down on information operations, Facebook suspended 30,000 accounts in France before the presidential election. The company said it was a priority to remove suspect accounts with high volumes of posting activity and the biggest audiences.
The company also explained how it monitored “several situations” that fit the pattern of information operations during the US presidential election. The company detected “malicious actors” using social media to share information stolen from other sources such as email accounts “with the intent of harming the reputation of specific political targets”. This technique involved creating dedicated websites to host the stolen data and then creating social media accounts and pages to direct people to it.
Some progress at last. They now appear to recognize that they’ve been used. As @Pinboard noted, the 3 threats the doc outlines—Content Creation, False Amplification, and Targeted Data Collection—are literally the Facebook business model.
The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth. And it is a particular kind of truth: The News Feed team’s ultimate mission is to figure out what users want — what they find “meaningful,” to use Cox and Zuckerberg’s preferred term — and to give them more of that.
The Atlantic looks at online pricing.
I am getting seriously worried that pricing may be the next fake news – what we thought was reliable is just, er, fake. I see no sign of regulators (or economic theory) addressing the problem. Part of the problem is obviously an asymmetry of knowledge – most web sellers (Amazon, I’m looking at you) know so much more than an individual consumer. I’m not convinced this issue is even on the UK Competition & Markets Authority’s radar. Maybe one solution is to force the historic publication of prices?
Christian Payne is a photographer who teaches organisations like the BBC, the UN and Al Jazeera how to do in-the-field reporting using mobile phones.
Personally, I think he’s a genius as he orders pizza to his train seat.
The online world forces individuals to make daily irrevocable decisions about their online footprint.
Consider the example of the Women’s March. The March was organized on Facebook, and 3-4 million people attended. The list of those who RSVP’d is now stored on Facebook servers and will be until the end of time, or until Facebook goes bankrupt, or gets hacked, or bought by a hedge fund, or some rogue sysadmin decides that list needs to be made public.
Any group that uses Facebook to organize comes up against this problem. But keeping this data around forever is not central to Facebook’s business model. The algorithms Facebook uses for targeting favor recency, and their output won’t drastically change if Facebook forgets what you were doing three months or three years ago.
We need the parts of these sites that are used heavily for organizing, like Google Groups or Facebook event pages, to become more ephemeral. There should be a user-configurable time horizon after which messages and membership lists in these places evaporate. These features are sometimes called ‘disappearing’, but there is nothing furtive about it. Rather, this is just getting our software to more faithfully reflect human life.
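A minimal sketch of that user-configurable time horizon might look like the following. The class and method names are hypothetical (nothing here is an actual Google or Facebook API); the point is only that expiry can be a property of the data itself, so that evaporated entries simply stop being returned:

```python
# Minimal sketch (hypothetical API) of an ephemeral message store:
# every entry carries an expiry derived from a user-configurable TTL,
# and expired entries evaporate rather than persisting forever.

import time

class EphemeralStore:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._entries = []          # list of (expires_at, message)

    def add(self, message, now=None):
        now = time.time() if now is None else now
        self._entries.append((now + self.ttl, message))

    def messages(self, now=None):
        """Return only messages that have not yet evaporated."""
        now = time.time() if now is None else now
        self._entries = [(t, m) for t, m in self._entries if t > now]
        return [m for _, m in self._entries]

store = EphemeralStore(ttl_seconds=90 * 24 * 3600)   # 90-day horizon
store.add("RSVP: alice", now=0)
print(store.messages(now=0))                  # ['RSVP: alice']
print(store.messages(now=91 * 24 * 3600))     # [] -- evaporated
```

Because expiry is enforced on read and the expired rows are dropped, an RSVP list built this way could not be dumped years later by a hacker, an acquirer, or a rogue sysadmin: the data would no longer exist.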