Addiction and Microtargeting: How “Social” Networks Expose Us to Manipulation

Photo Source: Kevin Dooley | CC BY 2.0

Roger McNamee, an early investor in Facebook, has stated that to keep us occupied, such platforms “have taken all the techniques of Edward Bernays and Joseph Goebbels, and all of the other people from the world of persuasion, and all the big ad agencies, and they’ve mapped it onto an all day product with highly personalised information in order to addict you.”

According to McNamee, “we are all to one degree or another addicted.” In a CNBC interview, McNamee says that he is “terrified” by the “damaging, unhealthy practices of these companies.”

In a 2017 interview Sean Parker, the founding president of Facebook, stated that the “thought process that went into building these applications, Facebook being the first of them, … was all about: ‘How do we consume as much of your time and conscious attention as possible’?”

Parker describes the reliance on comments and “likes” in such platforms as giving users “a little hit of dopamine,” which creates “social-validation feedback loops” that ultimately exploit “a vulnerability in human psychology.”

The negative effects of “social” media on our cognitive capacity, confirmed through an increasing number of studies, have prompted key designers, engineers, and product managers who were directly involved in the creation of such platforms to denounce them.

However, growing public criticism of social media companies’ predatory practices hasn’t led to any radical changes. As Paul Lewis observes in his article on social media’s effects on our minds, “refuseniks are rarely founders or chief executives, who have little incentive to deviate from the mantra that their companies are making the world a better place.”

In the same article, Lewis describes how Justin Rosenstein, an American software programmer who played a role in creating the “like” button on Facebook, and his peers draw “a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump.” According to them, “digital forces have completely upended the political system.”

These warnings from people who were instrumental in Facebook’s development are a clear sign that propaganda’s marriage with new technologies extends well beyond the presses and the TV screens, and into the world of social media and Big Data, where we are targeted individually through our addiction-fueled online habits.

“A Threat to an Individual’s Well-Being, Freedom, or even Life”

In “The Data That Turned The World Upside Down,” Hannes Grassegger and Mikael Krogerus, two reporters from Zurich-based Das Magazin, dive deep into the lucrative business of Big Data where political and corporate opportunists use our online habits in groundbreaking and profitable ways.

The authors begin their article with Michael Kosinski, who in 2008 went to do his PhD at Cambridge University’s Psychometrics Centre (psychometrics is a field that focuses on measuring psychological traits). As part of his studies, Kosinski and other researchers began developing a method of analyzing people in minute detail through “psychological profiles” based on one’s Facebook activity.

According to the article, around 2012 Kosinski’s team proved that “it is possible to predict someone’s skin color (with 95 percent accuracy), their sexual orientation (88 percent accuracy), and their affiliation to the Democratic or Republican party (85 percent)” just on the basis of an average of 68 Facebook “likes” by a user.

“But it didn’t stop there,” the authors state, “intelligence, religious affiliation, as well as alcohol, cigarette and drug use, could all be determined…it was even possible to deduce whether someone’s parents were divorced.”
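The likes-based prediction the researchers describe can be sketched in miniature. The following is a hypothetical illustration, not the Psychometrics Centre’s actual model: each user is encoded as a 0/1 vector over pages they have liked, and a simple logistic-regression-style classifier learns which pages correlate with a binary trait. All page names and training data below are invented.

```python
# Minimal sketch of trait prediction from "likes" (hypothetical data).
import math

PAGES = ["page_a", "page_b", "page_c", "page_d"]  # invented page names

def likes_vector(likes):
    """Encode a set of liked pages as a 0/1 feature vector."""
    return [1.0 if p in likes else 0.0 for p in PAGES]

def predict(weights, bias, x):
    """Probability the trait is present, via the logistic function."""
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=2000, lr=0.5):
    """Fit weights by plain stochastic gradient descent on log-loss."""
    weights, bias = [0.0] * len(PAGES), 0.0
    for _ in range(epochs):
        for likes, label in data:
            x = likes_vector(likes)
            err = predict(weights, bias, x) - label
            bias -= lr * err
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
    return weights, bias

# Toy training set: users who like page_a/page_b tend to have the trait.
training = [
    ({"page_a", "page_b"}, 1),
    ({"page_a"}, 1),
    ({"page_c"}, 0),
    ({"page_c", "page_d"}, 0),
]

weights, bias = train(training)
p = predict(weights, bias, likes_vector({"page_a", "page_b"}))
print(round(p, 2))  # high probability for a user resembling the positive class
```

Real systems work the same way in principle, just with tens of thousands of pages as features and millions of users as training rows, which is why accuracy climbs with the number of likes observed per user.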

As he came to terms with the implications of his team’s research, Kosinski began to add warnings to his papers that his approach “could pose a threat to an individual’s well-being, freedom, or even life.”

Key here, according to Grassegger and Krogerus, was the finding that Big Data can be used as a “people search engine” where third parties can target specific segments of the population based on their digital footprints.
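The “people search engine” idea reduces to a query over per-user predictions. A minimal sketch, with entirely invented profiles and trait scores, might look like this:

```python
# Sketch of a "people search engine": filter a population of predicted
# profiles down to a target segment. All data here is hypothetical.

profiles = [
    {"id": "u1", "openness": 0.9, "party": "undecided", "region": "midwest"},
    {"id": "u2", "openness": 0.2, "party": "decided",   "region": "midwest"},
    {"id": "u3", "openness": 0.8, "party": "undecided", "region": "south"},
]

def search(profiles, **criteria):
    """Return ids of users matching every criterion.
    Float criteria are treated as minimum predicted scores."""
    def matches(p):
        for key, want in criteria.items():
            have = p.get(key)
            if isinstance(want, float):
                if not (isinstance(have, (int, float)) and have >= want):
                    return False
            elif have != want:
                return False
        return True
    return [p["id"] for p in profiles if matches(p)]

# e.g. undecided midwestern users with high predicted openness
print(search(profiles, party="undecided", region="midwest", openness=0.7))
# -> ['u1']
```

The point is that once trait predictions exist, targeting a psychologically defined segment is a one-line query rather than a research project.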

It wasn’t long until Kosinski was approached by a company linked to Cambridge Analytica, which had previously worked with the election campaigns for Republicans Ben Carson, Ted Cruz, and Trump.

Kosinski ultimately chose to accept a position at Stanford, where he is now actively involved in discussions about the influence of Big Data on elections, the realities of living in an “echo chamber,” and the inevitability of a “post-privacy” world.

In other circles, however, the possibility of knowing what makes large segments of the population tick was something to fight over. Emails revealed in a May 2018 article published in The Guardian (part of a series on Cambridge Analytica) show that professors from Cambridge University were upset about Cambridge Analytica’s possible use of university resources.

The emails from 2014 show concern among university psychologists over the activities of Aleksandr Kogan, an assistant professor and colleague of Kosinski who, at the time, was involved in talks with the parent company of Cambridge Analytica—Strategic Communication Laboratories.

“Prof John Rust, the director at Cambridge University’s Psychometrics Centre, which had pioneered the study of psychology through large-scale data analysis, wrote to Kogan after a face-to-face meeting about a dispute with the two academics,” states the article.

“Rust accused Kogan of trying to make $1m in ‘personal profit in terms of asset and data’ from the scheme, while only reimbursing his fellow psychologists, Dr Michal Kosinski and Dr David Stillwell, who had led much of the cutting-edge research, with $100,000.”

It’s no secret that Cambridge Analytica was heavily funded by hedge-fund billionaire Robert Mercer, a major supporter of American neocon frenemies Ted Cruz and Donald Trump. This makes the conflict between academia and the “free market” quite significant, considering the increasing influence billionaires like the Koch Brothers have had in universities like George Mason.

The concerns of psychometricians are not unfounded. The role of companies like Cambridge Analytica, which filed for bankruptcy in 2018, was to buy people’s personal data through “globally active data brokers like Acxiom and Experian” and through “surveys on social media,” similar to the ones used for Kosinski’s research, and to use that data to understand what messages resonate with segments of the U.S. population.

Even though this practice has been around for some time, with both Obama’s and Clinton’s campaigns using Big Data to reach their target audiences, Cambridge Analytica’s publicized involvement in the Trump 2016 campaign and in efforts linked to Brexit has sparked interest in the moral implications of predicting individual attributes from digital records of behavior without our explicit consent.

Audience First, Content Later

In his 2016 Concordia Summit speech given just weeks before the U.S. presidential election, Alexander Nix, CEO of Cambridge Analytica, candidly explained how his company’s “revolutionary approach to audience targeting, data modeling, and psychographic profiling” made them a leader in “behavioral microtargeting for election processes around the world.”

“Communication is fundamentally changing,” Nix says in the video. “Back in the days of Mad Men communication was essentially top down…That means that brilliant minds get together to come up with slogans…and they pushed these messages onto the audience in the hope that they resonate. Today, we don’t need to guess which creative solution may or may not work…we can use hundreds and thousands of individual data points on our target audiences to understand exactly which messages will appeal to which audiences way before the creative process even starts.”

All of these data points are aggregated and synthesized in ways that allow companies to target audiences with extreme precision.

“Now we know that we need a message on gun rights, it needs to be a persuasive message, and needs to be nuanced according to the certain personality we are interested in. If we wanted to drill down further,” Nix explains, “we could resolve the data to an individual level, where we have somewhere close to four or five thousand data points on every adult in the United States.”
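Accumulating “four or five thousand data points on every adult” means joining records from many sources on a shared identifier. The sketch below is hypothetical (invented sources and fields), but it shows the basic merge that turns scattered broker records into one flat per-person profile:

```python
# Sketch of merging records from multiple data sources into per-person
# profiles, keyed on a shared identifier. Sources and fields are invented.
from collections import defaultdict

voter_file    = [{"person_id": 1, "registered": True, "district": 7}]
consumer_data = [{"person_id": 1, "owns_truck": True, "magazine_subs": 2}]
survey_scores = [{"person_id": 1, "neuroticism": 0.62}]

def merge_sources(*sources):
    """Join records on person_id into one flat profile per person."""
    profiles = defaultdict(dict)
    for source in sources:
        for record in source:
            pid = record["person_id"]
            fields = {k: v for k, v in record.items() if k != "person_id"}
            profiles[pid].update(fields)
    return dict(profiles)

merged = merge_sources(voter_file, consumer_data, survey_scores)
print(len(merged[1]))  # -> 5 data points accumulated for person 1
```

Scale the same join to hundreds of commercial sources and the per-person count climbs into the thousands Nix describes.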

In the video, Nix takes a pause and proceeds to talk more about the future of targeted advertising.

Nix’s statements suggest that propaganda aimed at segments of the population, such as the millennial birth cohort, relies on networks of data sets that reveal far more about audiences than demographic information alone.

In fact, Nix considers targeting based solely on demographics, i.e. “that all women should receive the same message because of their gender—or all African Americans because of their race” a “ridiculous idea,” stating that his own children will “never, ever understand this concept of mass communication.”

The Future of Personalized Political Marketing

“It doesn’t really matter much if they were inspired by my work or stumbled on these ideas independently,” Kosinski says in one interview about Cambridge Analytica’s connection to his research. “What matters is what they were doing and how they were going about it.”

Kosinski cautions that the use of psychographics might be trumped up by people who need a “scapegoat.” “Back when Obama used similar methods just calling them different names, no liberals were losing their sleep,” he states. “They also did not care when Hillary was spending way more money on personalised political marketing delivered by people way more competent than those working for Trump.”

In the same interview, Kosinski points out the positive aspects of personalised messages. “With both Donald Trump and Bernie Sanders, they attracted huge followings among groups of people who previously were not politically active.”

Cambridge Analytica’s exploitation of data illustrates how corporations have built direct access to our needs and desires which are revealed, consciously and unconsciously, through our online behavior—a feat that extends well beyond Edward Bernays’ wildest dreams.

At the same time, personalized marketing can also be used to organize around constructive causes. Platforms like GoFundMe and Patreon have shown that people can come together for the common good without the holy spirit of the free market, or mysterious algorithms, facilitating the process.

Yet, we still find ourselves surrounded by a constant stream of propagandist content generated by an ever-monopolizing news industry and mediated through platforms like Facebook. This creates an atmosphere of constant self-doubt which is fertile ground for any propagandist who wants to sway public opinion through “social” networks that spy on us and sell our data.

A solution to this problem would not be another surveillance system masquerading as a social network, but a transparent and independent platform that is not captured by the corporate state’s propaganda apparatus.

While there are plenty of alternative ideas, proposals, and projects addressing our broken news industry and toxic social media environment, the social media networks that have, by design, addicted and infiltrated our society will continue to buy or outspend their competitors to ensure their place in the market. How long this strategy will hold is anybody’s guess.

In July 2018, Facebook announced it would update its terms for data providers and agencies that create, upload, and then share certain Custom Audiences on behalf of advertisers. This came after Mark Zuckerberg’s much-publicized hearings on Capitol Hill, in which he took full responsibility for Facebook’s failure to respond to “Russian interference.”

Will these “new requirements” prevent the next oligarch, Russian or otherwise, from exploiting Facebook’s profitable ad targeting?