
Netflix's The Social Dilemma is Proving Your Data is the New Luxury

The common adage is that "if the service is free, you are the product". LUXUO discovers that you aren't the product: changing your behaviour so that companies and governments can predict your actions is the product. Your data privacy is the new luxury of the 21st century

Sep 28, 2020 | By Jonathan Ho

Surveillance capitalism is a term coined by Harvard professor and social psychologist Shoshana Zuboff in 2014. It describes the new market forces that accompany the rise of social media, in which the commodity for sale is your personal data and the capture and production of this data rely on mass surveillance: partly the capture of your data when you use "free" online services like Google or platforms like Facebook and, more chillingly, surreptitious means of listening in on conversations through your mobile devices and scanning your "word clouds" for keywords.

Typically, specialist analytics firms or divisions of data scientists within these companies collect and scrutinise our online behaviours (likes, dislikes, searches, how long we stay on specific posts) to produce data that further segments our behavioural proclivities and personality types, so that companies can use the information for commercial purposes.
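To make that pipeline concrete, here is a minimal sketch in Python of the kind of behavioural segmentation described above. Everything in it is a hypothetical illustration: the engagement signals, the numbers and the choice of three clusters are invented, and no real platform's pipeline is anywhere near this simple.

```python
# A toy sketch of behavioural segmentation, NOT any company's actual pipeline.
# Features, values and the cluster count are invented for illustration.
import numpy as np
from sklearn.cluster import KMeans

# Each row is one user; columns are hypothetical engagement signals:
# [likes_per_day, searches_per_day, avg_seconds_per_post, shares_per_day]
engagement = np.array([
    [40, 12,  95, 8],   # heavy, reactive user
    [35, 10,  88, 7],
    [ 3, 25,  20, 0],   # frequent searcher who rarely engages
    [ 2, 30,  15, 1],
    [12,  4, 300, 2],   # slow, deep reader
    [10,  5, 280, 1],
])

# Cluster users into behavioural segments that could then be packaged
# and sold to advertisers as distinct audience "personality types".
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(engagement)
for user_id, segment in enumerate(segments):
    print(f"user {user_id} -> segment {segment}")
```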

But that’s not the real danger.


The Real Danger of Social Media & Big Data is Behaviour Modification

“Some of these data are applied to service (Facebook, Twitter, etc) improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets.” – Shoshana Zuboff

Zuboff, a professor emerita at Harvard Business School, warns that the lights, bells and whistles of Big Tech and Big Data have made us blind and deaf to the real threats of the digital revolution. The late 20th century saw our economy move away from mass production lines to become progressively more reliant on knowledge, and a data-driven economy such as this one relies on "big data" to make money.

Shoshana Zuboff, professor emerita at Harvard Business School

The data used in this process is often collected from the same groups of people who will ultimately be its targets. And while the current method of monetisation is to commoditise this data by selling advertising to consumer goods companies that want to target us, the ultimate goal highlighted in Netflix's recent documentary, The Social Dilemma, is behavioural modification and prediction. Eventually, this machine-aided, inception-level reprogramming of the human being will result in a new kind of marketplace that Zuboff calls behavioural futures markets.

The New Luxury: Your Personal Data and Behavioural Futures Markets

“One of our goals was to figure out how to get as much of your attention as we could” – Tristan Harris, president and co-founder of the Center for Humane Technology, former Google Design Ethicist

According to the WSJ's Jeff Horwitz and Deepa Seetharaman, internal slides at Facebook showed that "our algorithms exploit the human brain's attraction to divisiveness." The slides also warned that, if left unchecked, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."

“We took a page from Big Tobacco’s playbook, working to make our offering addictive at the outset.” – Tim Kendall, former Facebook Director of Monetisation

Aza Raskin, co-founder of the Center for Humane Technology, was formerly the head of user experience at Mozilla Labs and lead designer for Firefox. Known for inventing the infinite scroll, Raskin is no stranger to keeping our attention; engagement is one of three key goals at tech companies, alongside growth and advertising. According to Raskin, the race is to build a better model for predicting human behaviour, and the most precise model wins. These addiction- and manipulation-based services are what Sean Parker, the first president of Facebook and famed co-founder of Napster, calls "exploiting vulnerabilities in human psychology."
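What "a better model at predicting human behaviour" can mean in practice is easy to sketch: a classifier trained on past engagement signals that scores which candidate post a user is most likely to click next. The sketch below is a deliberately simplistic rendering of that idea; the features, the synthetic data and the ranking rule are all invented for illustration.

```python
# Toy engagement-prediction sketch: rank candidate posts by predicted
# click probability. Features and data are invented for illustration;
# production systems use thousands of signals and far larger models.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical features: [seconds_spent_on_similar_posts, past_click_rate, hour_of_day]
X = rng.random((500, 3)) * np.array([300.0, 1.0, 24.0])
# Synthetic ground truth: users who dwell long and click often, click again.
y = ((X[:, 0] > 150) & (X[:, 1] > 0.5)).astype(int)

model = LogisticRegression().fit(X, y)

# The feed shows the candidate with the highest predicted engagement first.
candidates = rng.random((5, 3)) * np.array([300.0, 1.0, 24.0])
click_probability = model.predict_proba(candidates)[:, 1]
print("show first: candidate", int(np.argmax(click_probability)))
```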

Indeed, companies like Google and Facebook roll out dozens of tiny experiments on users, and over time they converge on the most effective ways of getting users to do what they want them to do. But when innovations like infinite scrolling and like buttons are no longer sufficient to keep you engaged, manipulation starts to take a darker form: triggering our evolutionary instincts for tribalism.
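Those "dozens of tiny experiments" are, in essence, continuous A/B tests. Below is a minimal sketch of the underlying loop, using an epsilon-greedy strategy to learn which of three hypothetical notification variants draws the most taps; the variant names and "true" response rates are invented for illustration.

```python
# Toy epsilon-greedy sketch of continuous experimentation: the system
# converges on whichever variant drives the most engagement.
# Variant names and "true" tap rates are invented for illustration.
import random

variants = ["red_badge", "buzz_twice", "preview_text"]
true_tap_rate = {"red_badge": 0.08, "buzz_twice": 0.12, "preview_text": 0.10}
shown = {v: 0 for v in variants}
taps = {v: 0 for v in variants}

random.seed(0)
for _ in range(10_000):
    if random.random() < 0.1:
        choice = random.choice(variants)  # explore occasionally
    else:
        # Exploit the best-known variant; unseen variants get an
        # optimistic score of 1.0 so each is tried at least once.
        choice = max(variants, key=lambda v: taps[v] / shown[v] if shown[v] else 1.0)
    shown[choice] += 1
    taps[choice] += random.random() < true_tap_rate[choice]

for v in variants:
    print(f"{v}: shown {shown[v]:>5} times, observed tap rate {taps[v] / shown[v]:.3f}")
```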

A recent Pew research study showed that in the United States, personal and political polarisation is at a 20-year high, and this is no coincidence. When Facebook conducted what it called massive-scale contagion experiments, it found that through subliminal cues on Facebook pages it was able to get more people to go vote in the midterm elections; having discovered this, it concluded that it could affect real-world behaviour and emotions without ever triggering the user's awareness.

“These algorithms have brought out the worst in us. They’ve literally rewired our brains so that we’re detached from reality and immersed in tribalism.” – Tim Kendall

Modern (Psych) Warfare: Captology and the Stanford Behavior Design Lab

The Stanford Behavior Design Lab, led by Dr. BJ Fogg, creates insight into how computing products, from websites to mobile phone software, can be designed to change what people believe and what they do. While this can bring about positive changes in many domains, the many graduates who move on to Silicon Valley tech companies might not always use captology (the body of expertise in the design, theory and analysis of persuasive technologies) responsibly, and might not even understand the ramifications of an amoral artificial intelligence taking their insights and applying them to the goals of engagement, growth and advertising.

“Behavior happens when Motivation, Ability, and a Prompt come together at the same moment.” – Dr. Fogg, Stanford Behavior Design Lab
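Fogg's model is often summarised as B = MAP: a behaviour (B) occurs when motivation (M), ability (A) and a prompt (P) converge above an "action line". The toy function below renders that threshold logic in Python; the 0-to-1 scales and the 0.25 action line are invented for illustration, not Fogg's actual quantification.

```python
# Toy rendering of the Fogg Behavior Model (B = MAP): a behaviour fires
# when a prompt arrives while motivation x ability clears the action line.
# The 0-1 scales and the 0.25 threshold are invented for illustration.
def behaviour_occurs(motivation: float, ability: float, prompted: bool,
                     action_line: float = 0.25) -> bool:
    return prompted and (motivation * ability) > action_line

# A notification (prompt) lands while you're bored (high motivation)
# and the app is one thumb-tap away (high ability): behaviour fires.
print(behaviour_occurs(motivation=0.7, ability=0.9, prompted=True))   # True
# Same prompt, but the action takes effort (low ability): nothing happens.
print(behaviour_occurs(motivation=0.7, ability=0.2, prompted=True))   # False
```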

Mentalism is a performing art in which practitioners, known as mentalists, appear to demonstrate highly developed mental or intuitive abilities. In the early days, these "magicians" used their ability to read body language to manipulate a subject subliminally through psychological suggestion: essentially a variation of asking someone to "pick a card, any card" when, in truth, you had already predisposed them to picking a particular card. However, it is one thing when a trick of mentalism is performed for entertainment, and quite another when super-computers and algorithms equipped with a near-quantitative profile of your quirks and proclivities are handed to a learning machine to figure out what would push your buttons best.

One such person who found herself staring down the barrel of potent super-computing and artificial intelligence was Allison Gill. Holding three degrees, including a doctorate, the Mensa member and graduate of the US Naval Nuclear Power Training Command discovered that, as highly educated and intelligent as she was, she had almost been duped by the GOP and the Kremlin into throwing away her vote in 2016. In those days, Gill was a Bernie supporter, and from spending so much time with other Bernie supporters on Facebook and Twitter, her shares and other social media activities had made her a vulnerable target for psychometric profiling.

As time went on, Gill discovered that her entire newsfeed was bombarding her with negative information about Hillary and the Democrats, and soon she became convinced that the DNC was evil, the system was rigged, and the only way out was to "send a message" to the establishment by bucking the system with a protest vote. The lifelong Democrat left the Democratic Party and vowed to write in a candidate she knew couldn't win, convinced she would "send a message", until she realised that throwing her vote away would not only NOT send a message, it would contribute to the election of Donald Trump.

In fact, over time, the narrative became more sinister: that a Trump victory would be fine because it would burn the system down and send a message. Right now, Gill sees the same pattern repeating itself, and the narrative is one which blames the failures of President Trump on House Speaker Nancy Pelosi, the Mueller investigation and the failures of House Democrats, rather than on the missteps and mistakes of the individual himself. Chillingly, polarisation is once again being used to create extreme-left and extreme-right ideologues who fail to realise that the common destiny of their home country is at stake, just so you will spend more time on social media. It's a phenomenon that is also driving conversations that are not "triggered" by an algorithm. As our echo chambers deepen, so does our propensity to seek out and share information which confirms our world views to other like-minded individuals.

Guardian columnist Charles Arthur believes members of parliament in Westminster are not immune either. He writes, "WhatsApp groups mark Westminster's tribal lines; the Labour and Tory MPs who left to form the Independent Group were apparently thrown out of their respective party-oriented WhatsApp groups in a move as ceremonial as the breaking of a cashiered soldier's sword." He hypothesises, "What if using the WhatsApp messaging service means the European Research Group is in effect radicalising its Brexiter Tory members, so they egg each other on to take more and more extreme positions in pushing for no deal? What if groups on Facebook are giving people the chance to say things they wouldn't consider saying aloud in public?"

While The Verge argues that the world is more complicated than the filmmakers of The Social Dilemma want to believe, the author of the essay, Casey Newton, ironically dismantles his own argument with the example of Sophie Zhang, a data scientist on Facebook's Site Integrity fake-engagement team who dealt with "bots influencing elections and the like". Zhang described "coordinated influence campaigns" by governments in countries including Azerbaijan, Honduras, India, Ukraine and Bolivia, using Facebook against their own citizens. Yes, the world contains geopolitical and socio-economic policy tools beyond Facebook, but social media platforms are the battlefield on which the battles for our hearts and minds are waged. All the major civil wars have been fought because two major segments of society diverged to the point of irreconcilable, extreme ideological positions.

“Civil war?” – Tim Kendall, when asked in The Social Dilemma what his biggest fear was


According to UC Berkeley, online filter bubbles that expose us to ideas we already agree with are consistent with a broader psychological literature on confirmation bias, which shows that we are more likely to seek out and agree with views that align with our pre-existing beliefs. The algorithms currently feed this bias and reward us with a dose of dopamine and self-confidence at having our assumptions proven right (even if they are factually wrong), so having our newsfeeds algorithmically deliver our preferred news sites and op-eds makes it easier to listen only to groups or individuals who validate our own world views. Case in point: in the US Presidential Elections, Trump has spent millions on misleading Facebook ads targeting undecided voters. A critic can argue that Trump can spend millions on any platform, but media like television don't have programmatic processes to deliver targeted disinformation to people already predisposed to believing that information. As evidenced by Ms. Allison Gill, even extremely educated, critical-thinking individuals can fall prey to AI's ability to accurately map us and deliver information designed to trigger us.
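Why engagement-optimised ranking deepens these bubbles can be shown in a few lines: if a feed simply ranks content by similarity to what a user already liked, dissenting material never surfaces. The sketch below uses made-up topic vectors standing in for real content embeddings; it illustrates the mechanism, not any platform's actual code.

```python
# Toy filter-bubble sketch: ranking by similarity to past likes means
# the feed converges on what the user already believes.
# The 3-dimensional "topic vectors" are invented stand-ins for embeddings.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical axes: [partisan_left, partisan_right, neutral_science]
items = {
    "left_oped":    np.array([0.9, 0.0, 0.1]),
    "right_oped":   np.array([0.0, 0.9, 0.1]),
    "fact_check":   np.array([0.1, 0.1, 0.9]),
    "left_outrage": np.array([1.0, 0.0, 0.0]),
}

# The user's profile is simply the average of what they already liked.
liked = [items["left_oped"], items["left_outrage"]]
profile = np.mean(liked, axis=0)

# Rank the feed by similarity to the profile: more of the same rises,
# opposing views sink to the bottom, and the bubble tightens.
feed = sorted(items, key=lambda name: cosine(profile, items[name]), reverse=True)
print(feed)  # ['left_outrage', 'left_oped', 'fact_check', 'right_oped']
```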

Facebook has many anti-polarisation initiatives, and while owner and founder Mark Zuckerberg has stated that he is not inclined to have the platform make editorial judgments on speech, the company is guilty, if not directly complicit, in making the algorithmic choices used in the spread of polarising speech. In real terms, social media platforms are catalysts for the resurgent growth of anti-science flat earthers, #pizzagate conspiracy theorists and hate groups. The company's own 2016 internal presentation says it best: "64% of all extremist group joins are due to our recommendation tools" and "Our recommendation systems grow the problem."
