Carole Cadwalladr in The Guardian does a useful roundup of her horrifying investigation into influencing the electoral and referendum process.
What’s been lost in the US coverage of this “data analytics” firm is the understanding of where the firm came from: deep within the military-industrial complex. A weird British corner of it populated, as the military establishment in Britain is, by old-school Tories. Geoffrey Pattie, a former parliamentary under-secretary of state for defence procurement and director of Marconi Defence Systems, used to be on the board, and Lord Marland, David Cameron’s pro-Brexit former trade envoy, a shareholder.
Steve Tatham was the head of psychological operations for British forces in Afghanistan. The Observer has seen letters endorsing him from the UK Ministry of Defence, the Foreign Office and Nato.
SCL/Cambridge Analytica was not some startup created by a couple of guys with a Mac PowerBook. It’s effectively part of the British defence establishment. And, now, too, the American defence establishment. An ex-commanding officer of the US Marine Corps operations centre, Chris Naler, has recently joined Iota Global, a partner of the SCL group.
This is not just a story about social psychology and data analytics. It has to be understood in terms of a military contractor using military strategies on a civilian population. Us. David Miller, a professor of sociology at Bath University and an authority in psyops and propaganda, says it is “an extraordinary scandal that this should be anywhere near a democracy. It should be clear to voters where information is coming from, and if it’s not transparent or open where it’s coming from, it raises the question of whether we are actually living in a democracy or not.”
THE EFFECT: I cut a deck of cards a couple of times, and you glimpse flashes of several different cards. I turn the cards facedown and invite you to choose one, memorize it and return it. Now I ask you to name your card. You say (for example), “The queen of hearts.” I take the deck in my mouth, bite down and groan and wiggle to suggest that your card is going down my throat, through my intestines, into my bloodstream and finally into my right foot. I lift that foot and invite you to pull off my shoe and look inside. You find the queen of hearts. You’re amazed. If you happen to pick up the deck later, you’ll find it’s missing the queen of hearts.
Teller (the smaller, quieter half of Penn & Teller) gives the best explanation of a magic trick I’ve seen.
Facebook has publicly acknowledged that its platform has been exploited by governments seeking to manipulate public opinion in other countries – including during the presidential elections in the US and France – and pledged to clamp down on such “information operations”.
In a white paper authored by the company’s security team and published on Thursday, the company detailed well-funded and subtle techniques used by nations and other organizations to spread misleading information and falsehoods for geopolitical goals. These efforts go well beyond “fake news”, the company said, and include content seeding, targeted data collection and fake accounts that are used to amplify one particular view, sow distrust in political institutions and spread confusion.
“We have had to expand our security focus from traditional abusive behavior, such as account hacking, malware, spam and financial scams, to include more subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people,” said the company.
In its effort to clamp down on information operations, Facebook suspended 30,000 accounts in France before the presidential election. The company said it was a priority to remove suspect accounts with high volumes of posting activity and the biggest audiences.
The company also explained how it monitored “several situations” that fit the pattern of information operations during the US presidential election. The company detected “malicious actors” using social media to share information stolen from other sources such as email accounts “with the intent of harming the reputation of specific political targets”. This technique involved creating dedicated websites to host the stolen data and then creating social media accounts and pages to direct people to it.
Some progress at last. They now appear to recognize that they’ve been used. As @Pinboard noted, the three threats the document outlines—Content Creation, False Amplification and Targeted Data Collection—are literally the Facebook business model.
Via John Naughton, an excerpt from The New York Times about Facebook by Farhad Manjoo:
The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth. And it is a particular kind of truth: The News Feed team’s ultimate mission is to figure out what users want — what they find “meaningful,” to use Cox and Zuckerberg’s preferred term — and to give them more of that.
The Atlantic looks at online pricing.
I am getting seriously worried that pricing may be the next fake news – what we thought was reliable is just, er, fake. I see no sign of regulators (or economic theory) addressing this. Part of the problem is an obvious asymmetry of knowledge: most web sellers (Amazon, I’m looking at you) know so much more than any individual consumer. I’m not convinced the issue is even on the UK Competition & Markets Authority’s radar. Maybe one solution is to force the historic publication of prices?
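The “publish historic prices” idea can be made concrete with a short sketch. Assuming (hypothetically) that sellers had to publish the prices a product actually sold for, a consumer or regulator could check whether the price they are quoted diverges from that record. Everything here – the function name, the tolerance, the figures – is invented for illustration:

```python
# Hypothetical sketch: with published price histories, a quoted price
# can be audited against what others have recently paid.
from statistics import median

def audit_quote(quoted_price, published_history, tolerance=0.10):
    """Flag a quote more than `tolerance` (10% by default) above the
    historic median. All names and data here are invented."""
    typical = median(published_history)
    markup = (quoted_price - typical) / typical
    return markup > tolerance, round(markup, 3)

# Invented example: a product that has mostly sold for around £20
# is suddenly quoted at £24.99 to this particular shopper.
history = [19.99, 20.49, 19.99, 21.00, 20.25]
flagged, markup = audit_quote(24.99, history)
```

Nothing sophisticated – the point is only that the asymmetry of knowledge shrinks once the history is public, because the check becomes a one-liner anyone can run.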
Christian Payne is a photographer who teaches organisations like the BBC, the UN and Al Jazeera how to do in-the-field reporting using mobile phones.
Personally, I think he’s a genius: he orders pizza to his train seat.
Via Charles Arthur, Maciej Ceglowski in Idle Words:
The online world forces individuals to make daily irrevocable decisions about their online footprint.
Consider the example of the Women’s March. The March was organized on Facebook, and 3-4 million people attended. The list of those who RSVP’d is now stored on Facebook servers and will be until the end of time, or until Facebook goes bankrupt, or gets hacked, or bought by a hedge fund, or some rogue sysadmin decides that list needs to be made public.
Any group that uses Facebook to organize comes up against this problem. But keeping this data around forever is not central to Facebook’s business model. The algorithms Facebook uses for targeting favor recency, and their output won’t drastically change if Facebook forgets what you were doing three months or three years ago.
We need the parts of these sites that are used heavily for organizing, like Google Groups or Facebook event pages, to become more ephemeral. There should be a user-configurable time horizon after which messages and membership lists in these places evaporate. These features are sometimes called ‘disappearing’, but there is nothing furtive about it. Rather, this is just getting our software to more faithfully reflect human life.
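The “user-configurable time horizon” idea is simple enough to sketch. This is a minimal illustration of the mechanism Ceglowski describes, not anything Facebook or Google actually implements; the class name and in-memory store are invented:

```python
# Sketch of an ephemeral membership list: records older than a
# user-chosen horizon simply evaporate on the next read.
import time

class EphemeralList:
    def __init__(self, horizon_seconds):
        self.horizon = horizon_seconds   # user-configurable retention window
        self._records = []               # (timestamp, payload) pairs

    def add(self, payload, now=None):
        """Record an RSVP or message with its creation time."""
        self._records.append((now if now is not None else time.time(), payload))

    def members(self, now=None):
        """Return only records younger than the horizon. Older ones
        are dropped permanently — evaporated, not archived."""
        now = now if now is not None else time.time()
        self._records = [(t, p) for t, p in self._records
                         if now - t < self.horizon]
        return [p for _, p in self._records]
```

A march organizer might construct this with a ninety-day horizon; anyone who later seized the server would find only the last ninety days of names, because the rest no longer exists anywhere. As the essay says, there is nothing furtive about this — it just makes the software forget the way people do.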