
Don’t Be Out-Zucked by Facebook

Many people at Facebook call their boss – Mark Zuckerberg, whose net worth stands at $82.3bn – simply Zuck. After a raft of recent scandals, many have concluded that people’s trust in Facebook was misplaced, that Facebook is a threat, and that it undermines democracy. Overall, three things have enabled Facebook’s stratospheric success: relentless surveillance; the sharing (i.e. selling) of user data; and what the industry rather euphemistically calls persuasive technologies, which in reality amount to behavior modification – boiling down, in the end, to behavior manipulation.

In March 2016, the news reported on a group that exploited a programming tool on Facebook to gather data on users expressing an interest in Black Lives Matter – data that it then sold to police departments, which struck me as evil. In another incident, Facebook’s advertising tools enabled property owners to discriminate based on race, in violation of the Fair Housing Act.

Even scarier is the fact that Facebook gives an advantage to campaign messages based on fear or anger over those based on neutral or positive emotions. This can tilt elections. With 2.89bn monthly users globally, Facebook has become a powerful news source in most democratic countries.

Facebook.com launched on 4 February 2004, and given its global and monetary success ever since, Facebook has grown aloof. Facebook protects what it has, following what strategic management calls a defender strategy. That means constructing Facebook like a castle with six concentric rings of walls. Facebook’s walls grow higher all the time, and on top of them, Facebook has fortified itself with the three other known defensibilities of the internet age: brand, scale, and embedding.

One of the keys to Facebook is embedding users in its system, where they perform Borg-like tasks. It makes users think they are seeing balanced content when, in fact, they are trapped in what is called a filter bubble, created and enforced by algorithms. Hidden behind Facebook’s “like” button is a system that delivers data on emotional triggers to Facebook. Facebook converts users into products that can be sold to advertisers. Bingo! You are out-Zucked, and Zuckerberg cashes in.

Facebook recognized long ago that the really big money comes from attracting the advertisers who had historically spent giant budgets on traditional media. Hence the decline of newspapers and the rise of Facebook’s profits. With that, Facebook became a juggernaut. Facebook combines psychology, persuasion techniques, and propaganda with techniques borrowed from slot machines – such as rewards – tying them to the human social need for approval and validation in ways that few users can resist.

Progress in AI has assisted Facebook. Facebook’s use of artificial intelligence includes behavioral prediction engines that anticipate our thoughts and emotions, based on patterns found in the reservoir of data it has accumulated about users. In other words, users effectively serve artificial intelligence – rather than the other way around. People think they use Facebook when, in fact, Facebook uses them. You are out-Zucked.

Combine this with the power of algorithms, and it becomes clear why Facebook’s AI knows more about users than they can imagine. Facebook knows that sensational headlines work better than calm descriptions of events. It also knows that outraged users share more content, to let other people know what they should also be outraged about. Facebook knows this because it conducts research on these issues.

In 2014, Facebook conducted a massive experiment without obtaining prior informed consent or providing any warning: it made people sad just to see if it could be done. Facebook never apologized for running a giant psychological experiment on its users. Facebook also found that in such an environment the loudest voices dominate, which can be intimidating. Furthermore, Facebook’s News Feed enables every user to surround him- or herself with likeminded people.

Facebook loves groups of likeminded people because they enable easy targeting by advertisers. Besides, sociologists know that when likeminded people discuss issues, their views tend to grow more extreme over time. Potentially, this creates extremists. For Facebook, this is merely a minor side issue; for democracy, it can be devastating. Extreme views attract more attention, so platforms recommend them – rewarding the worst social behavior.

Facebook also knows that people in a filter bubble become increasingly tribal, isolated, and extreme. Worse, they disregard expertise in favor of voices from their tribe. This is how a large minority of Americans abandoned newspapers in favor of talk radio and websites that peddle conspiracy theories. Filter bubbles and preference bubbles undermine democracy by eliminating the last vestiges of common ground among a huge percentage of Americans. The tribe is all that matters, and anything that advances the tribe is legitimate.

Facebook does not create preference bubbles, but it is the ideal incubator for them. Its algorithms ensure that users who like one piece of disinformation will be fed more disinformation – not because conspiracy theories are good for users but because conspiracy theories are good for Facebook. In other words, Facebook’s business model, the way it makes billions of dollars, damages democracy. Along the way, Facebook has poisoned political discourse in democracies around the world.

Facebook can do all this because it exists in a largely unregulated market – even though, for Facebook, there is hardly a market at all: Facebook is a true monopoly. Commonly, this is camouflaged by the ideology of neoliberalism. Nonetheless, a former editor of the Harvard Business Review once made the following stunning admission when talking about management:

Business executives are society’s leading champions of free markets and competition, words that, for them, evoke a worldview and value system that rewards good ideas and hard work, and that fosters innovation and meritocracy. Truth be told, the competition every manager longs for is a lot closer to Microsoft’s end of the spectrum than it is to the dairy farmers. All the talk about the virtues of competition notwithstanding, the aim of business strategy is to move an enterprise away from perfect competition and in the direction of monopoly.

Like other monopolies, technology monopolies can be regulated, although corporate lobbying and the corporate media have anchored the urban myth that technology corporations cannot be. This popular misconception that regulation does not work for technology rests on four flawed premises:

1) regulation cannot keep pace with rapidly evolving technology;

2) government intervention always harms innovation;

3) regulators can never understand tech well enough to provide oversight; and

4) the market will always allocate resources best.

An F-35 Jet, Disinformation, and the Number 77,744

Regulating tech monopolies like Facebook might be urgently needed to prevent people like Donald Trump from leaving TV shows and pretending to be president. The Toddler in Chief profited from Russian interference in the election. He alone campaigned on their themes: anti-immigration, white racism, ultra-nationalism, and right-wing populism. Russia used Facebook to undermine democracy and influence a presidential election for roughly one hundred million dollars – less than the price of a single F-35 fighter. It paid off handsomely. The resulting disinformation worked its way through the American population along five routes:

1) disinformation and conspiracy theories are often incubated on sites like Reddit, 4chan, 8chan, and Facebook;

2) from there, they move to Twitter;

3) if no journalist picks up the story, people on Twitter post new messages saying: read the story the mainstream media doesn’t want you to know about;

4) eventually, some legitimate journalist may write about the story; and finally,

5) once the disinformation is trending, other news outlets pick it up. At that point, it is time to go for the mass market – which means Facebook, where the Russians placed the story in the Facebook groups they controlled.

Facebook has very little incentive to remove disinformation, conspiracy theories, and filter bubbles because they improve its metrics: time spent on the site, engagement, and sharing. Despite all this, Facebook continues to argue that it is simply a technology company and that it values neutrality. Unfortunately, the evidence suggests this is not quite the whole truth. Facebook itself admitted that 126 million users had been exposed to Russian interference.

That number represents more than one-third of the US population, but it grossly understates the impact. The Russians did not reach a random set of 126 million people on Facebook; their efforts were highly targeted and, among other things, discouraged Democratic voters from voting. The fact that four million people who voted for Obama in 2012 did not vote for Clinton in 2016 may reflect the effectiveness of Russia’s interference.

In an election where only 137 million people voted, a campaign that targeted 126 million eligible voters almost certainly had an impact. The importance of voter suppression becomes even clearer when one recalls the following: in 2016, the winner of the electoral college lost the popular vote by nearly three million ballots, while the election was decided in three states that Trump won by a combined 77,744 votes.

Is it possible that Facebook’s engagement machinery influenced the outcome? Yes – indeed, it is virtually impossible that it didn’t. In other words, Facebook’s top team, including Zuck, has had a decisive influence on US democracy. Hopefully, in 2020, American voters will not be out-Zucked again.

Roger McNamee’s book Zucked is published by Penguin.