
The Dystopian Future of Facebook

Photo Source thierry ehrmann | CC BY 2.0

This year Facebook filed two very interesting patents in the US. One was for emotion recognition technology, which recognises human emotions through facial expressions and can therefore assess what mood we are in at any given time, whether happy or anxious, for example. This can be done either by a webcam or through a phone camera. The technology is relatively straightforward. Algorithms driven by artificial intelligence analyse and then decipher facial expressions, matching the duration and intensity of each expression with a corresponding emotion. Take contempt, for example. Measured on a scale from 0 to 100, an expression of contempt could be inferred from a smirking smile, a furrowed brow and a wrinkled nose. An emotion can then be extrapolated from the data and linked to your dominant personality traits: openness, introversion or neuroticism, say.
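To make that idea concrete, here is a minimal sketch, in Python, of how such a pipeline might turn detected facial cues into 0 to 100 emotion scores. The action-unit names, the cue lists and the simple averaging rule are illustrative assumptions of mine, not Facebook's patented method.

```python
# A hypothetical sketch of emotion scoring from facial "action units".
# The cue names, intensities and averaging rule are illustrative assumptions,
# not Facebook's actual algorithm.

# Intensities (0.0 to 1.0) that a face-analysis model might report for one frame.
frame_action_units = {
    "lip_corner_pull_unilateral": 0.7,  # a smirking smile
    "brow_lowerer": 0.5,                # a furrowed brow
    "nose_wrinkler": 0.4,               # a wrinkled nose
}

# Which action units are taken as cues for which emotion (illustrative only).
EMOTION_CUES = {
    "contempt": ["lip_corner_pull_unilateral", "brow_lowerer", "nose_wrinkler"],
    "joy": ["lip_corner_pull_bilateral", "cheek_raiser"],
}

def score_emotions(action_units):
    """Return a 0-100 score per emotion by averaging the intensity of its cues."""
    scores = {}
    for emotion, cues in EMOTION_CUES.items():
        intensities = [action_units.get(cue, 0.0) for cue in cues]
        scores[emotion] = round(100 * sum(intensities) / len(cues), 1)
    return scores

print(score_emotions(frame_action_units))  # e.g. {'contempt': 53.3, 'joy': 0.0}
```

A real system would estimate these cues from video frames with a trained model and map them to emotions statistically; the point of the sketch is only the scoring step described above.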

The accuracy of the match may not be perfect, and it is always good to be sceptical about what is being claimed, but AI (artificial intelligence) learns exponentially and the technology keeps getting better; it is already much, much quicker than human intelligence.

Recently at Columbia University a competition was set up between human lawyers and their AI counterparts. Both read a series of non-disclosure agreements containing loopholes. The AI found 95% of them, compared to 88% found by the humans. The human lawyers took 90 minutes to read them; the AI took 22 seconds. More incredibly still, last year Google’s AlphaZero beat Stockfish 8 at chess. Stockfish 8 is an open-source chess engine with access to centuries of accumulated human chess experience. Yet AlphaZero taught itself using machine learning principles, free of human instruction, beating Stockfish 8 28 times and drawing 72 out of 100 games. It took AlphaZero four hours to teach itself chess independently. Four hours from blank slate to genius.

A common misconception about algorithms is that they can be easily controlled; rather, they can learn, change and run themselves, a process known as deep learning, driven by artificial neural networks. In other words, they run on self-improving feedback loops. Much of this is positive, of course: solutions to collective problems like climate change that no human has yet thought of become more possible in the future. The social payoffs could be huge too. But what of the use of AI for more nefarious ends? What if, as Yuval Noah Harari says, AI becomes just another tool used by elites to consolidate their power even further in the 21st century? History teaches us that it isn’t Luddite to ask this question, nor is it merely indulging in catastrophic thinking about the future. Rapidly evolving technology ending up in the hands of just a few mega companies, unregulated and uncontrolled, should seriously concern us all.

Algorithms, as Jamie Bartlett, the author of The People Vs Tech, puts it, are “the keys to the magic kingdom” of understanding deep-seated human psychology: they filter, predict, correlate, target and learn. They also manipulate. We would be naive in the extreme to think they don’t already, and even more naive to think the manipulation is done only by commercial entities. After all, it’s not as if there aren’t plenty of online tribes, some manufactured and some not, to be manipulated into and out of political viewpoints, or fleeced of their money.

In 2017 Facebook said it could detect teenagers’ moods and emotions, such as feeling nervous and insecure, from their entries, a claim it later denied, adding that it does not “offer tools to target people based on their emotional state”. The internal report was written by two Australian executives, Andy Sinn and David Fernandez. According to The Guardian, it was written for a large bank and said that “the company has a database of its young users – 1.9 million high schoolers, 1.5 million tertiary students and 3 million young workers”.

Going one better still, Affectiva, a Boston company, claims to be able to detect and decode complex emotional and cognitive data from your face, voice and physiological state using emotion recognition technology (ERT), amassing 12 billion “emotion data points” across gender, age and ethnicity. Its founder has declared that Affectiva’s ERT can read your heart rate from a webcam without you wearing any sensors, simply by analysing the light reflected from your face, which reveals blood flow and, by extension, your blood pressure. Next time you’re listening to Newstalk’s breakfast show, think of that.

Affectiva’s ultimate goal, of course, once you get past all the feel-good optimistic guff about “social connectivity”, “awesome innovation” and, worst of all, “empowering”, is, to use their own words, to “enable media creators to optimize their content”. In other words, profiting from decoding our emotional states.

Maybe Facebook (and Google) would use this technology wisely for our benefit; then again, maybe not. It isn’t such a stretch to imagine how it could be used unethically too: to microtarget customised ads and messages at us depending on our state of mind at a given time, say, just as Cambridge Analytica was allowed to harvest the personal data of 87 million Facebook users to subvert democracy with Brexit and Trump. Facebook claims it wasn’t aware of this, though. Well, maybe, maybe not, and in spite of its protests in recent years the company is still not especially transparent or accountable given its enormous cultural and social power in our lives. Curiouser and curiouser, you might think, and you’d be right.

The second Facebook patent is even more interesting, if that’s the right word, or dystopian if you prefer. Published this June under the code US20180167677 (with the title Broadcast Content View Analysis Based on Ambient Audio Recording, application no. 15/376,515), it describes a process by which secret messages, “ambient audio fingerprints” in the jargon, embedded in TV ads would trigger your smart technology (phone or TV) to record you while the ad was playing, presumably to gauge your reaction to the product being advertised at you through, perhaps, voice biometrics (i.e. the identification and recognition of the pitch and tone of your voice).

As the patent explains, in near impenetrable but just about understandable jargon, this is done by first detecting one or more broadcasting signals (the advertisement) of a content item. Second, the ambient audio of the content item is recorded, and an audio feature is extracted “from the recorded ambient audio to generate an ambient fingerprint”. Finally, wait for it, “the ambient audio fingerprint, time information of the recorded ambient audio, and an identifier of an individual associated with a client device (you and your phone or smart TV) recording the ambient audio” is sent “to an online system for determining whether there was an impression of the content by the individual.” It goes on to say that “the impression of the identified content item by the identified individual” is logged in a “data store of the online system”.
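To illustrate the flow the patent describes, here is a minimal, hypothetical sketch in Python of the client-side steps: record ambient audio, reduce it to a fingerprint, and bundle that with a timestamp and a user identifier for the online system to match and log as an impression. The function names, the hash-based fingerprint and the payload fields are my own assumptions, not Facebook’s implementation.

```python
# A hypothetical sketch of the client-side flow described in US20180167677:
# record ambient audio, derive a fingerprint, and report it with a timestamp
# and a user identifier. Names, fingerprinting method and payload fields are
# illustrative assumptions, not Facebook's actual code.

import hashlib
import time

def record_ambient_audio(duration_seconds: int) -> bytes:
    """Placeholder for capturing raw audio from the device microphone."""
    return b"\x00" * 16000 * duration_seconds  # silence, for illustration

def extract_fingerprint(audio: bytes) -> str:
    """Reduce the recorded audio to a compact fingerprint (here, a simple hash)."""
    return hashlib.sha256(audio).hexdigest()

def report_impression(fingerprint: str, user_id: str) -> dict:
    """Bundle the fingerprint, timestamp and user identifier for the online system."""
    payload = {
        "ambient_audio_fingerprint": fingerprint,
        "recorded_at": time.time(),
        "user_id": user_id,
    }
    # In a real system this payload would be sent over the network; the online
    # system would match the fingerprint against known broadcast content and
    # log the impression in its data store.
    return payload

if __name__ == "__main__":
    audio = record_ambient_audio(duration_seconds=5)
    print(report_impression(extract_fingerprint(audio), user_id="user-123"))
```

A production system would presumably use an acoustic fingerprint robust to background noise rather than an exact hash, but the shape of the exchange, fingerprint plus time plus identity sent to a central data store, is what the patent text spells out.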

It goes on to state that “content providers have a vested interest in knowing who have listened and/or viewed their content”, that the features described in the patent are not exhaustive, and that “many additional features and advantages will be apparent to one of ordinary skill in the art…”.

It is already obvious that we don’t know how much Facebook and the other big tech platforms monitor us; nor do we know how much data they hold on us individually and collectively and, critically, who has access to that data and how they could use it.

If you can sell consumer goods by such manipulation, why not whole ideologies? Chipping away at our human agency one dystopian tech innovation at a time paves the way for late-stage capitalism to morph into authoritarian capitalism, one efficiency gain at a time.

If put into place, such “innovations” are designed to monitor our emotional states for monetary gain. In essence, it is a type of online mood tracking in which we are the digital lab rats. Facebook is already valued at half a trillion US dollars, giving it huge economic and cultural power.

According to Private Eye magazine, Facebook’s legal team say the patent was filed “to prevent aggression from other companies”, and that “patents tend to focus on future-looking technology that is often speculative in nature and could be commercialised by other companies”. As Private Eye pointed out, though, it’s not as if Facebook has been completely transparent about such secretive issues in the past or present. The fact that Facebook generates billions by manipulating our emotions is not a surprise to us; its business model is based on it. But how it intends to do so in the future should both surprise and alert us. We are, after all, the product. Over 90% of Facebook’s revenue comes from selling adverts. It has the market incentive.

How will all this play out in the future? It isn’t difficult to build a picture of a commercialised and rapacious big tech dystopia, the very opposite of the freedoms and civil liberties envisaged by the original pioneers of the internet, and the opposite of how the big tech companies currently perceive themselves.

Verint, a leading multinational analytics and biometrics corporation with an office in Ireland, has been known to install and sell “intrusive mass surveillance systems worldwide including to authoritarian governments”, according to Privacy International: governments that routinely commit human rights abuses against their own citizens.

China, a world leader in surveillance capitalism, recently declared that by 2020 a national video surveillance network, Xueliang, Sharp Eyes in English, will be fully operational. Kafka and Orwell must be smirking knowingly somewhere. The term harks back to the post-war slogan in communist China that “the people have sharp eyes”, when neighbours were encouraged to spy on one another and report counter-revolutionary or defeatist gossip about the 1949 revolution.

Democracies too have built overarching systems of surveillance. Edward Snowden told us in 2013 that the NSA was given secret direct access to the servers of big tech companies (Facebook, YouTube, Google and others) to collect private communications. As Glenn Greenwald said, the NSA’s unofficial “motto of omniscience” is: Know it all, Collect it all, Process it all.

Jaron Lanier, a pioneer of virtual reality technology, a tech renegade and, to some, an apostate, recently called the likes of Facebook and Google “behaviour manipulation empires”. Their pervasive surveillance and subtle manipulation through “weaponised advertising”, he argues, debases democracy by polarising debate at a scale unthinkable even just five or ten years ago, and it’s not only advertising that can be weaponised. Facebook, Google, Twitter and Instagram all have “manipulation engines” (algorithms we know little about) running in the background, Lanier says, designed specifically by thousands of psychological and “emotional engineers” (“choice architects” or “product philosophers”, to use the inane corporate gobbledygook). Their job is to keep you addicted to what’s now known as the “attention economy”, and attention equals profit. A better description still might be the attention/anxiety economy.

Twitter, for instance, has a three-second delay between the page loading and notifications loading; Facebook does something similar, and the notifications are always in urgent red. These tricks are known in psychology as intermittent variable rewards: they keep behaviour going through the hope of maybe being rewarded with a like or a follower. This builds anticipation, releases feel-good neurotransmitters and taps into your need to belong and to be heard; we are intensely social creatures. The downside is the opposite, of course, where we can be thrown onto an emotional rollercoaster if the expected dopamine hit doesn’t come.

The goal is addiction, a consumption frenzy of socially approved validation. Big Tech’s social media universe is, as one reformed “choice architect” put it, “an attention seeking gravitational wormhole” that sucks you into their profit-seeking universe. If you don’t think so, check how many times you look at your phone every day. The average person checks theirs 150 times. Most of that is social media. We’re all in an attention arms race now.

There is a great German word: Zukunftsangst. Roughly translated, it means future-anxiety. Maybe it should be renamed Zuckerbergangst instead.


Mark Kernan is a Freelance Writer and Independent Researcher. Follow him @markkernan1
