Playing Algorithm ’n Blues

We all live in the Age of the Algorithm.

So here’s a story that not only encapsulates the age, but also shows how the obsession with algorithms can go horribly wrong.

It all started when Facebook censored the iconic photo of “napalm girl” Kim Phuc, which became a symbol of the Vietnam War recognized all over the world. The photo was featured in a Facebook post by Norwegian writer Tom Egeland, who wanted to start a debate on “seven photographs that changed the history of war”.

Not only was his post erased; Egeland was also suspended from Facebook.

Aftenposten, the number one Norwegian daily, owned by Scandinavian media group Schibsted, duly relayed the news, alongside the photo.

Facebook then asked the paper to erase the photo – or to render it unrecognizable in its online edition. Yet even before the paper responded, Facebook censored both the article and the photo on Aftenposten’s Facebook page.

Norwegian Prime Minister Erna Solberg protested it all on her Facebook page. She was also censored.

Aftenposten then slapped the whole story on its front page, alongside an open letter to Facebook founder Mark Zuckerberg signed by the newspaper’s editor-in-chief, Espen Egil Hansen, accusing Facebook of abuse of power.

It took a long 24 hours for the Menlo Park colossus to back off and “unblock” the publication.

An opinion wrapped up in code

Facebook was forced into a great deal of after-the-fact damage control. That does not change the fact that the “napalm girl” imbroglio is a classic algorithm drama: the application of artificial intelligence to evaluate content.

Facebook, just like other Data Economy giants, outsources filtering to an army of moderators working at companies from the Middle East to South Asia, as Facebook’s Monika Bickert has confirmed.

These moderators may have a hand in establishing what should be expunged from the social network, based on what users flag. But the information is then run through an algorithm, which makes the final decision.

It doesn’t take a PhD to note that these moderators may not exactly excel in cultural competence, or be capable of analyzing context. Not to mention algorithms, which are incapable of “understanding” cultural context and are certainly not programmed to interpret irony, sarcasm or cultural metaphor.

Algorithms are literal. In a nutshell: an algorithm is an opinion wrapped up in code.
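To make that concrete, here is a minimal, purely hypothetical Python sketch of what such a literal-minded filter looks like. None of this is Facebook’s actual code; the threshold, the escalation cutoff and the signal names are all invented for illustration. The point is that every constant is a human choice frozen into logic that has no input for context.

```python
# Hypothetical moderation rule: every constant below is a human editorial
# choice frozen into code. The function sees numbers, never context, so an
# iconic war photograph and gratuitous content are scored the same way.

NUDITY_THRESHOLD = 0.8   # picked by a person: the "opinion" inside the code


def moderate_image(nudity_score: float, user_flags: int) -> str:
    """Decide an image's fate from numeric signals alone."""
    if nudity_score >= NUDITY_THRESHOLD:
        return "remove"        # no field exists for historical significance
    if user_flags > 10:
        return "human_review"  # escalation cutoff: another arbitrary choice
    return "keep"


# Identical inputs always yield identical verdicts, whatever the context.
print(moderate_image(nudity_score=0.91, user_flags=3))  # -> remove
```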

And yet we are now reaching a stage where a machine decides what is news. Facebook, for instance, now relies solely on an algorithm to establish which stories to feature in its Trending Topics section.

There may be an upside to this trend – as in Facebook, Google and YouTube using systems to quickly block Daesh videos and similar jihadi propaganda. Soon eGLYPH will be in effect – a system similar to YouTube’s Content ID, which censors copyright-infringing videos using “hashing”. Video and audio flagged as “extremist” will be assigned a unique fingerprint, allowing automatic removal of any new version and the blocking of any re-upload.
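For illustration, here is a minimal sketch of the general mechanism behind such hash-matching systems. Real systems like Content ID or eGLYPH rely on robust “perceptual” fingerprints that survive re-encoding and cropping; the plain SHA-256 below is an assumption made for brevity, and it only catches byte-identical re-uploads.

```python
import hashlib

# Fingerprints of content already ruled "extremist" or infringing.
BLOCKLIST = set()


def fingerprint(data: bytes) -> str:
    """Reduce a media file to a fixed-size fingerprint."""
    return hashlib.sha256(data).hexdigest()


def register_banned(data: bytes) -> None:
    """A human (or another algorithm) rules this content banned."""
    BLOCKLIST.add(fingerprint(data))


def allow_upload(data: bytes) -> bool:
    """Every new upload is checked against the blocklist automatically."""
    return fingerprint(data) not in BLOCKLIST


video = b"...raw video bytes..."
register_banned(video)
print(allow_upload(video))         # False: removed automatically, no appeal
print(allow_upload(video + b"x"))  # True: one changed byte defeats SHA-256,
                                   # hence the need for perceptual hashing
```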

And that brings us to even murkier territory: the very concept of “extremism” itself, and the effects on all of us of self-censorship systems based on algorithmic logic.

How WMDs run our life

It is against this backdrop that a book such as Weapons of Math Destruction, by Cathy O’Neil (Crown Publishing), becomes as essential as the air we breathe.

O’Neil is the real deal: a PhD in math from Harvard, a former professor at Barnard College, a former quant at a hedge fund before reinventing herself as a data scientist, and a blogger at mathbabe.org.

Mathematical models are the engines of our digital economy. That propels O’Neil to formulate her two critical insights, which may startle the legions who regard machines as simply “neutral”.

1) “Math-powered applications powering the data economy [are] based on choices made by fallible human beings”.

2) “These mathematical models [are] opaque, their workings invisible to all but the highest priests in their domain: mathematicians and computer scientists. Their verdicts, even when wrong or harmful, [are] beyond dispute or appeal. And they tend to punish the poor and the oppressed in our society, while making the rich richer”.

Thus O’Neil’s concept of Weapons of Math Destruction (WMDs): how destructive math models are accelerating a social earthquake.

O’Neil extensively details how destructive math models now micromanage vast swathes of the real economy, from advertising to the prison system, not to mention the finance industry (as in all the aftereffects of the never-ending 2008 crisis).

These math models are essentially opaque and unaccountable, and they target above all the “optimization” of the (consuming) masses.

A golden rule is – what else – to follow the money. As O’Neil puts it, for “the people running the WMDs”, the “feedback is money”: “the systems are engineered to gobble up more data and fine-tune their analytics so that more money will pour in”.

Victims – as in Obama administration drone strikes – are mere “collateral damage”.

Parallels between the finance casino and Big Data are inevitable – and it helps that O’Neil has worked in both industries. This is something I examined in a column on how Silicon Valley follows the money: we see the same talent pool from elite US universities (MIT, Stanford, Princeton), and the same obsession with doing whatever it takes to rake in more cold hard cash for the outfit that employs them.

WMDs favor efficiency. “Fairness” is a mere concept. Computers don’t understand concepts. Programmers don’t know how to code a concept – as we saw in the “napalm girl” story. And they don’t know how to adjust algorithms to reflect fairness either.

What we do have is the concept of “friendship” being measured by likes and connections on Facebook. O’Neil sums it all up: “If you think of a WMD as a factory, unfairness is the black stuff belching out of the smoke stacks. It’s an emission, a toxic one”.

Gimme cash flow, now

In the end, it’s the Goddess of the Market that rules it all – prizing efficiency, growth and endless cash flow.

Even before the “napalm girl” fiasco, O’Neil had made the crucial point that Facebook actually determines, according to its own interests, what everyone sees – and learns – on the social network. No less than two-thirds of American adults have a Facebook profile. Nearly half, according to a Pew Research Center report, rely on Facebook for at least some of their news.

Most Americans – not to mention most of Facebook’s 1.7 billion users around the world – are unaware that Facebook tinkers with the news feed; people actually believe that the system instantly shares anything that is posted with their community of friends.

Which brings us, once again, to the key question on the news front. By tweaking its algorithm to shape the news people see, Facebook now has all it takes to game the whole political system. As O’Neil notes, “Facebook, Google, Apple, Microsoft, Amazon have vast information on much of humanity – and the means to steer us in any way they choose”.

Their algorithms, of course, are strategically priceless: the ultimate non-transparent trade secrets. They “carry out their business in the dark”.

On his recent, much-publicized trip to Rome, Mark Zuckerberg said that Facebook is “a high-tech company, not a news company”. Well, not really. The most intriguing aspect of the “napalm girl” fiasco may be the fact that Schibsted, the Scandinavian media group, is planning huge investments to create a new social forum and defy – who else – Facebook. Get ready for a brand new war on the WMD front.

This piece first appeared at Strategic Culture Foundation

Pepe Escobar is the author of Globalistan: How the Globalized World is Dissolving into Liquid War (Nimble Books, 2007), Red Zone Blues: a snapshot of Baghdad during the surge and Obama does Globalistan (Nimble Books, 2009). His latest book is Empire of Chaos. He may be reached at pepeasia@yahoo.com.