
Algorithmic Dictatorship

Source: author’s AI-based creation using craiyon.com.

Digital dictatorship describes the rule of algorithms, the invisible computer code that increasingly runs the Internet and, by extension, our lives. While lacking a distinct ideological backing, it also carries faint connotations of the Italian fasci and fascio.

Literally, fasci means bundle or sheaf, and figuratively league – the league of the Fasci Italiani di Combattimento. We know them as Fascists. Our increasingly digitalized world has the potential to mutate into a world of extreme authoritarianism. This world may even show affinities with what the eminent Henry Giroux calls The Culture of Fascism.

Every day, algorithmic, digital, and Internet-based computer networks enable a smooth and seemingly all-pervasive culture of global mass surveillance run by governments and corporate entities. Both are united in the common interest of creating diligent followers – diligent voters and diligent employees – asphyxiated inside an ever-encroaching regime of machine-based digital dictatorship, or ideology-based Algo-Fascism.

The ammunition for this mass surveillance is supplied by our daily Internet use and by the secret algorithms that run their eternal script while shaping our Internet habits, likes, and thinking. It is run by a handful of global GAFAM corporations (Google (Alphabet), Apple, Facebook (Meta), Amazon, and Microsoft) that are only too happy to exploit all this for a handsome profit.

Digital dictatorship and Algo-Fascism have absorbed the Roman panem et circenses, or bread and circuses. Today, this extends from what the global mass media present, along with the accompanying neoliberal ideology, onto the Internet. It culminates in the bread and circuses of mindless entertainment.

It has merged both into a computer screen that offers amusement and access to a seemingly endless amount of information and disinformation but also, in return, the creepy entrapment of the user inside the corporate, yet very seductive, system of algorithmic fascism and digital dictatorship.

Perhaps the world of an algorithmic dictatorship is even worse than what was predicted by Lewis Mumford in his seminal work on the megamachine, and what Norbert Wiener outlined as techno-feedback loops.

Today’s algorithms have the true potential to diminish, if not eliminate, the human dimensions of life. Collectively, we stand to lose ground to the calculative efficiencies and über-performances of algorithms. Their superior efficiencies and performances work in the service of governments, capitalism, and even criminal networks.

It stretches to rentier capitalism’s global financial system, which has long merged neoliberal gangster capitalism with money laundering, stealthy offshore banking, and what is known as exotic accounting. The infamous Panama Papers are but one, albeit microscopic, episode of the global merger between rentier capitalism and financial criminality.

Beyond all this, artificial intelligence (AI)-driven algorithms have already established new patterns of what we perceive as good and bad. They have created a world beyond a human-generated morality. They have, as Nietzsche would say, gone Beyond Good and Evil – except that this beyond now comes with a coded script.

What Nietzsche had in mind – a new definition of good and evil – and what digital dictatorship does today is the creation of a new threshold of what we understand as normality (tax havens) and abnormality (striking nurses). Algorithms even tell the police what parts of a city must be policed by defining which geographical areas in a city are “normal” or “abnormal”.

Simultaneously, algorithms shape actions and calibrate responses to, for example, a crime. Increasingly, it is no longer “we” who define what counts as a crime but, for example, an algorithm-driven mini-computer linked to a powerful algorithmic system bolted into a police car that – hopefully – does not drive down your local street.
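
To make concrete how such a system might sort neighborhoods into “normal” and “abnormal”, here is a minimal, purely illustrative sketch in Python. The grid cells, incident counts, and threshold are all invented for the example; real predictive-policing products are proprietary and far more elaborate.

```python
# Purely illustrative sketch: how an algorithm might tag city areas as
# "normal" or "abnormal" from historical incident counts alone.
# All data, cell names, and the threshold are invented for this example.

from statistics import mean, pstdev

# Hypothetical number of recorded incidents per grid cell over some period.
incidents = {
    "cell_A": 2,
    "cell_B": 3,
    "cell_C": 21,   # a single outlier cell
    "cell_D": 4,
    "cell_E": 1,
}

counts = list(incidents.values())
mu, sigma = mean(counts), pstdev(counts)

def label(count, threshold=1.0):
    """Tag a cell 'abnormal' if its count sits more than `threshold`
    standard deviations above the city-wide mean."""
    z = (count - mu) / sigma if sigma else 0.0
    return "abnormal" if z > threshold else "normal"

for cell, count in incidents.items():
    print(cell, count, label(count))

# Note the feedback loop: patrolling "abnormal" cells more heavily produces
# more recorded incidents there, which reinforces the label next period.
```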

Just as we can no longer imagine the end of capitalism but can imagine the end of the world, we can no longer imagine life without the Internet, email, Facebook, LinkedIn, and, for some, Tinder and Match.

Even more so, and particularly since the recent advent of ChatGPT, we can effortlessly imagine – and access – a post-human culture where engagement, debate, discussions, and discourse circulate ideas without any need for an actual, real human author.

We have – effectively – handed over control to algorithms for easy school essay writing, the composition of newspaper articles, songs, and pictures using, for example, www.craiyon.com.

Smoke-screened by the ideology of neoliberalism’s individualism and personal choice, it is now the algorithm that determines our preferences, our likes and dislikes, and, worse, steers us toward ever more products we do not need, bought with money we do not have, to impress people we do not even like.

Ultimately – and this is one of the main motivations behind corporate algorithms – they entice us, assisted by rafts of behavioral psychologists working in the background and by what is euphemistically called persuasion technology, to buy the “right” things. These are things that can be turned into a profit.
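
A minimal sketch of the kind of scoring this describes, assuming – hypothetically – that a recommender blends predicted interest with profit margin. The product names, scores, and weight are invented for illustration and do not describe any particular platform’s actual ranking.

```python
# Illustrative sketch only: ranking products not by what a user needs,
# but by predicted interest weighted by the seller's margin.
# All products, scores, and the margin weight are invented assumptions.

products = [
    # (name, predicted_interest 0..1, profit_margin 0..1)
    ("budget_item_you_searched_for", 0.90, 0.05),
    ("premium_upsell",               0.55, 0.60),
    ("sponsored_gadget",             0.40, 0.80),
]

MARGIN_WEIGHT = 1.5  # hypothetical: how strongly profit outweighs interest

def score(interest, margin, w=MARGIN_WEIGHT):
    """Blend predicted interest with profit margin; a high margin can
    push a product above one the user actually wanted."""
    return interest + w * margin

ranked = sorted(products, key=lambda p: score(p[1], p[2]), reverse=True)
for name, interest, margin in ranked:
    print(f"{name:32s} score={score(interest, margin):.2f}")

# With this weighting, the 'sponsored_gadget' outranks the item the user
# searched for -- the "right" thing to buy is the profitable one.
```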

The commodification of user data – our data – isn’t an entirely new phenomenon. As early as the 1980s, the big data group LexisNexis sold data. Undeterred by the selling of even the most intimate data – ordering condoms online, for example – our increasingly personal technological devices offer many people some sort of algorithmic convenience.

In the wake of this, society as a whole and we as individuals are, it seems, slowly but surely losing our common standards and even our shared concerns regarding surveillance – online and otherwise.

Perhaps it remains a rather forceful argument that our common rejection of online surveillance is not born out of a fear that we, as it is said, have something to hide, but rather out of a fear that there is no longer a safe place to hide on the Internet.

Such a very basic and very human assumption very quickly leads into a Kafkaesque nightmare. Worse, if one does not truly know what an algorithm is programmed to do, chances are that the algorithm is programming us. Such an algorithm can be used to produce something that stimulates humans to share things by arousing excitement, panic, rage, and other intensified emotional responses.

Some algorithms have been engineered – whether intentionally or not – to infect the human mind and then to turn that person into a replicator of that virus: a specific idea, a product to be sold, or a certain ideology such as neoliberalism, Trumpism, right-wing populism, racism, perhaps even fascism.
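
How an engagement-maximizing feed could end up amplifying rage and panic can be sketched in a few lines. This is an assumption-laden toy: the posts, the “arousal” scores, and the weights are made up, and no real platform publishes its ranking function.

```python
# Toy sketch of engagement ranking: posts predicted to arouse strong
# emotion (rage, panic, excitement) get boosted because arousal predicts
# shares. Every number and label below is an invented assumption.

posts = [
    # (headline, predicted_arousal 0..1, factual_quality 0..1)
    ("calm, well-sourced explainer",      0.20, 0.95),
    ("outrage bait about the other side", 0.90, 0.30),
    ("panic-inducing rumour",             0.85, 0.10),
]

def predicted_shares(arousal, quality, arousal_weight=3.0):
    """Hypothetical model: sharing is driven mostly by emotional arousal,
    only weakly by factual quality."""
    return arousal_weight * arousal + 0.5 * quality

feed = sorted(posts, key=lambda p: predicted_shares(p[1], p[2]), reverse=True)
for headline, arousal, quality in feed:
    print(f"{predicted_shares(arousal, quality):.2f}  {headline}")

# The explainer lands last; the rage and panic items top the feed --
# the 'viral replicator' effect described above.
```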

Whether inside online silos, chat rooms, or echo chambers, the engineering of algorithms comes with a rather widespread practice of fragmentary, impersonal communication. Increasingly, this demeans interpersonal interaction and, worse, shifts human collaboration onto non-personal interactions designed to take place between a human and an algorithm.

Over time, it is very likely that algorithms will shape a new generation of people who will be normalized and, worse, adapted and adjusted – some might say manipulated – to impersonality. In the end, this sort of fragmented communication will be an AI-guided form of interaction.

Worse, this generation will also be accustomed to digital technologies that facilitate a reduction of human expectation – particularly of what a human person is, what a human being can be, should be, and Kant’s ought to be.

It also manipulates how each person can become a mature person while making sure that most people no longer reach what the German philosophers Kant and Adorno call Mündigkeit – the mature, self-reflective, self-determined, and critical human being. Eliminating this remains a vital step toward digital dictatorship.

The increased reliance on AI-driven algorithms can create a very unforgiving world – the world of digital dictatorship – with extremely serious consequences: social isolation, individual alienation, mass anxiety, the ever-increasing global pandemic of depression, widespread self-harm, suicide, and unquestioning compliance with authoritarianism.

Beyond those pathologies, and perhaps even functioning as an early signifier of a looming authoritarianism, a fully fledged digital dictatorship would only exacerbate them.

One tiny and early signifier of a potential march towards digital dictatorship is the mind-numbing use of Twitter, which is, by default, structurally ill-equipped to handle complex content, thinking, and expression simply because of its inherent character limits. And that is not even considering the impact of algorithms on what philosophy calls personhood.

Combined with the thought-destructive and depersonalizing features of, for example, Twitter – its simplicity and impulsivity – it does not come as a surprise to find algorithm-driven, intolerant, degrading, aggressive, and insulting forms of discourse.

This not only undermines online civility and, eventually, societal civility; it is yet another element driving society towards digital dictatorship.

On the upside, there is still the fact that, particularly in an increasingly complex informational society, the demands for cognitive functioning are growing steadily and perhaps even exponentially. On the downside, a digital dictatorship does not depend on cognitive functioning – it depends on the opposite. We can see this by looking at 20th-century history.

Wasn’t it the highly developed, highly cultured, and scientific Germany that mutated into Nazism during the 1930s? In other words, neither the Italian fascism of the 1920s, nor its German version of the 1930s, nor the digital dictatorship of the future depends on cognitive functioning – each depends on a mass that can be manipulated, whether online or elsewhere.

To overcome the limits of present cognitive functioning, people have started to experiment with ways to conquer these natural limitations, using three main strategies: bio-chemical, physical, and behavioral enhancement.

The next level may well be neuroscience creating what are known as cyborgs – hybrids of cybernetic machinery and the human organism. This creates a human/machine being with both organic and biomechatronic body parts.

This is already happening, and it occurs without even thinking of Terminator-style futuristic killer robots or the not-so-futuristic Boston Dynamics-style killer robots, such as the automated killer drones flying over Ukraine in 2023.

Beyond such AI-guided killer robots, artificial intelligence is inserting itself into very old and critical notions about human beings and our place in the technological infrastructure of the future.

With diminished agency, the handing over of this place to AI may well lead to digital dictatorship. In the sociological dictum of agency vs. structure, it is a move from human agency toward algorithmic structure, in which the structure determines the agency. At that point, a digital dictatorship becomes a reality.

That also marks the precise point in the development towards digital dictatorship where another shift occurs. At that point, the framing of AI-based algorithms becomes the problem: we are told that algorithms are the solution “for” human beings, while the algorithm, in turn, frames human beings as the problem for technology.

What if it is no longer our mobile phones, our computers, and the software, but our bodies and our brains that quickly become out of date? And what if these become effectively obsolete?

On the eve of the arrival of digital dictatorship, we may well be living through the very last cycle of technological development led entirely by humans. As our existence becomes ever more trapped inside seemingly useful computers, algorithmic machines, and AI, we continue – step by step – to delegate more and more human decision-making and problem-solving to them.

Not only do we diminish human agency, we also use human memory less and less. Machines, databanks, hard drives, the cloud, and algorithm-guided computer systems now store most of our knowledge. Simultaneously, we hand over personal and even rather intimate information to AI systems.

And we do so without acknowledging how much power we are giving to the algorithms that – in turn, and slowly but surely – will not only predict but also guide our decisions and our behaviors, and even define our place in society. All this and much more will be defined by digital dictatorship.

As algorithms are furnished with decision-making powers, these machines are also designed to learn through artificial intelligence. This is another key step in our society’s move towards digital dictatorship. These self-learning entities function – and this is yet another vital component of digital dictatorship – without having to be reprogrammed by humans.

Under digital dictatorship, they will have learned to act without human control. On the way there, the infamous singularity – a hypothetical future in which technological growth is out of control and irreversible – marks a significant road sign. Digital dictatorship means the control of human society by self-learning artificial intelligence algorithms.
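
What “learning without being reprogrammed” means can be illustrated with a minimal online-learning loop: the code below never changes, yet its behavior drifts as it updates its own parameters from feedback. The two options, the simulated reward probabilities, and the exploration rate are invented for the sketch.

```python
# Minimal illustration of self-adjusting behaviour: a tiny bandit-style
# learner that shifts which option it favours purely from feedback,
# without any human rewriting its code. Rewards here are simulated.

import random

options = ["option_A", "option_B"]
values = {o: 0.0 for o in options}   # learned estimates, start neutral
counts = {o: 0 for o in options}

def true_reward(option):
    """Simulated environment (an assumption for the demo):
    option_B pays off more often than option_A."""
    return 1.0 if random.random() < (0.3 if option == "option_A" else 0.7) else 0.0

random.seed(0)
for step in range(1000):
    # Epsilon-greedy choice: mostly exploit current estimates, sometimes explore.
    if random.random() < 0.1:
        choice = random.choice(options)
    else:
        choice = max(values, key=values.get)
    r = true_reward(choice)
    counts[choice] += 1
    # Incremental update: the program adjusts its own estimates.
    values[choice] += (r - values[choice]) / counts[choice]

print(values)   # estimates drift toward roughly 0.3 and 0.7 on their own
print(counts)   # the learner ends up favouring option_B without reprogramming
```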

Under digital dictatorship, the question of who is serving whom has been solved in favor of self-learning artificial intelligence algorithms. And here comes the never-asked question at the dawn of digital dictatorship:

Just because AI technology and algorithms facilitate the next step of human society’s move towards digital dictatorship – camouflaged as efficiency, speed, advancement, global betterment, and productivity – should human beings almost automatically jump onto the bandwagon of so-called “technical progress” while seated in a high-speed train that no longer knows how to stop, to reflect, to contemplate, and to be part of humanity?

It might already be way too late to be asking such questions since the race towards digital dictatorship has already started.