The Metaphysics of Memory

Memory – René Magritte, Painting (1948)

It takes strength to remember, it takes another kind of strength to forget, it takes a hero to do both. People who remember court madness through pain, the pain of the perpetually recurring death of their innocence; people who forget court another kind of madness, the madness of the denial of pain and the hatred of innocence; and the world is mostly divided between madmen who remember and madmen who forget. Heroes are rare.

– James Baldwin

There are certain modern myths which gain credence and currency because they are rendered in a ‘scientific’ tone and language.  One such myth is that of the photographic memory.  As the name suggests, this refers to a person who can recall a past scene with all the accuracy of a photographic image.  Such memories neither fade nor fail, and their crystalline clarity means they can be examined at will, in the same way one might call up a digital image that has lost none of its clarity or lustre even when viewed many years later.

The idea of the human mind as operating akin to a machine, as a recording device with a given amount of storage space, is a belief which only comes into its own in the 19th century, at the time of the industrial revolution when those ‘dark satanic mills’ were springing up in and around the great cities; such a doctrine takes shape in a society where technological production has been ratcheted up to its zenith, its product effectively measured and quantified according to the relentless rhythms of the conveyor belt, and where human labour itself has been inexorably fused with the pistons and levers of the factory monolith.

Perhaps the first human ‘thinking machine’ was Conan Doyle’s fictional detective Sherlock Holmes, who appeared in the 1880s, a product of the Victorian epoch.  Conan Doyle was at pains to emphasise not only the brilliance of his celebrated detective but also the way in which Holmes’ mind had a given quantitative capacity which could be tweaked and refined in much the way one might tune a sophisticated piece of machinery in order to yield its optimum capacity.  Doyle has his detective describe how:

 a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort … the skilful workman is very careful indeed as to what he takes into his brain-attic. He will have nothing but the tools which may help him in doing his work, but of these he has a large assortment, and all in the most perfect order … there comes a time when for every addition of knowledge you forget something that you knew before. It is of the highest importance, therefore, not to have useless facts elbowing out the useful ones.[1]

In our own period, in the epoch of the microchip and the smartphone – when one can tap a small, limpid screen and within seconds access the vast stock of human knowledge accrued over millennia – the ‘Holmes’ archetype has become ever more technologized.  In the recent twenty-first-century incarnation in which the genius detective is played by Benedict Cumberbatch, Holmes is revealed to have a photographic memory; specifically, he possesses a ‘mind palace’ which he can retreat into in order to recall the smallest of facts or details buried in the subconscious of his awesome mental landscape.  We see him navigate this realm through a series of fantasy emblems which very much resemble the digital icons one might see on the screen of a smartphone or computer.

The idea of the photographic memory, therefore, is linked to a broader notion of genius in an age in which human production and human potential are understood in accordance with forms of mechanical productivity which can be strictly quantified according to specific measurements, whether in terms of physical output or monetary value.  Indeed, the development of computers has led to a situation where ‘memory’ itself is conceived in purely quantitative terms; i.e. the device is described as having a ‘memory’ which can be measured in gigabytes.

The ‘photographic’ memory is in some sense the reflection of this: the genius of the most brilliant individuals is more and more conceived in machine-like terms (camera/computer); that is, it is understood according to a tangible remit, expressed in purely quantified and mechanical terms – the speed at which one recalls, for instance, or the amount of information which can be recalled.

To borrow from the beloved British sitcom Blackadder, however, there is one little flaw in this bold and creative conception.  It happens to be bollocks.  For a start, there has never been a proven case.  The closest anyone has come to successfully claiming a photographic memory was the case of a Harvard student known as Elizabeth.  She was tested by a scientist who placed before her left eye a pattern of 10,000 dots, before showing her right eye another pattern of 10,000 dots.  According to the scientist in question, Charles Stromeyer III, Elizabeth was able to fuse the two pictures together into a single three-dimensional image of all the dots in all their positions.  Having run the experiment, however, Stromeyer proceeded to marry his subject, and the pair refused any further testing.  His claims have been allowed to linger, therefore, but remain forever unsubstantiated.

Another claim to the power of the photographic memory was a historical one; that is to say, the Shas Pollaks, a set of Talmud scholars, were tested in the early part of the twentieth century and were able to give the precise locations of certain words on the pages of twelve books.  But as the journalist Joshua Foer writes in Slate, this was not so much to do with ‘photographic memory’ but rather ‘heroic perseverance’, for if the ‘average person decided he was going to dedicate his entire life to memorizing 5,422 pages of text, he’d probably also be pretty good at it. It’s an impressive feat of single-mindedness, not of memory.’[2]

There are, of course, people who have extremely good memories, such as the memory savant Jill Price, who was able to recall, decades later, the precise date the TV series M*A*S*H went off air and was even able to give an account of the weather that same day.  But while Price’s memory is undoubtedly prodigious, a more in-depth test – in which she was asked to recall a long list of words verbatim – showed her missing several key words.  In other words, her memory didn’t ‘offer instantaneous recall of discrete, fine-toothed details.’[3]  It wasn’t ‘photographic’.

My suspicion is that many of those who make such claims fetishize notions of intelligence and memory, partly because they tend to come from a social stratum which is always seeking to justify its own power and prestige in terms of an innate and elite set of intellectual gifts with which the individual mind has been blessed – rather than the set of privileges and advantages that same ruling elite inherits by virtue of its social and economic position.

In this way, much like those parents who ensure that their children have private tuition which focusses on techniques for attaining high scores in IQ tests, so too are some people driven to develop ‘photographic’ memories through intense acts of training and discipline – because the ‘high IQ’ or the ‘photographic memory’ denotes the means by which their social superiority can be enshrined as clinical ‘scientific’ fact.

This endeavour to train oneself in accordance with an archetype of ‘genius’ corresponds to an epoch in human history where everything has been rationalised and quantified, measured according to the metrics of dollar and cent, and where accumulation is the buzzword.  Just as one’s fortune can be objectively gauged by the millions upon millions sequestered in bank accounts, so too can one’s intelligence be quantified by the IQ points stored in one’s brain, and one’s memory described in the quantified and delineated terms of a mechanical device – a high-end camera with a vast and precise capacity to record every image its flash illuminates.

Of course, the photographic memory is just as much a fiction as the concept of IQ.  But if it were possible for a human individual to possess a photographic memory – to be able to recall all the details of anything the mind had recorded – what would such a condition mean?  Would it be wholly desirable?

From the point of view of passing exams, of recalling information in an instant without having to resort to Wikipedia, of remembering people’s names at parties and never forgetting an anniversary, one would imagine it would be a pretty useful thing.  In terms of being able to relive the most perfectly happy moments of your life in glorious technicolour, and in real time, again the advantages are palpable.

But the same logic would also underwrite graphic disadvantages.  If one could recall a car accident many years later in the same warping, scorched and graphic detail as the moment of the collision, the nightmares might never fade.   Or what if one couldn’t avoid recalling, with sharp and real-time clarity, the expression on the face of a loved one lying in a hospital bed as they went through the last throes of an excruciating terminal cancer?  Would the sharpness of all-consuming grief ever have the chance to meld into the softer and more gentle melancholy of remembrance?

The issue, however, is not simply about these more practical considerations – the benefits or damages wrought to the psyche.  The more significant issue is the means by which memory operates at the most fundamental level.  Our consciousness, in assimilating the details of the present moment, is always separating the wheat from the chaff; that is, it is always focusing on the important details – such as the words you are reading – while relegating to the background such information as the colour of the page, or the type of font the text employs.  The great horror writer Stephen King described those details which the mind is aware of but does not bring into conscious focus as belonging to the realm of the ‘subaudible’.

It is a neat expression which hints at a more profound truth: the process of creating thoughts, of learning, is a process of abstraction; i.e. the mind is constantly cleaving away unessential details in order to fix upon some essential characteristic – and such discrimination is in the nature of memory itself.  More often than not, remembering is the process by which an essential thought or event from the past is excavated; in bringing it to light from the purview of the present, the memory itself is developed; some details assume a new and vital significance because they have been revealed in the context of new horizons, while other, more insignificant details fade away in the passage of time.

In other words, a memory should not be conceived of as a single entity, a snapshot frozen in eternity; rather any single, specific memory forms its own mental chain; each time you recall it, it is changed and shaped organically – your mind brings new details to the fore while exiling others in accordance with your changing life experience.  What you remember is no longer the thing in itself – the original memory – but rather the memory of the memory of the memory.

Any single memory, therefore, cannot be conceived of in terms of a photograph – because, as an organic process rather than a static thing, its ability to shed certain details while deepening and ripening others is what ultimately allows it to reach fruition in our minds.  And, of course, our mental landscape is crisscrossed by these memory-forming chains – some of these are excluded as time passes, some are forgotten – so that those memories most pertinent to who we are, and to the experiences which have shaped us most fundamentally, are pulled to the fore and better retained.

Or to say the same, remembering simultaneously necessitates a process of forgetting.  A camera can record an image but it cannot remember it, because the images the camera records are fixed, perfect, but mutually exclusive photographs, which stand in discrete and absolute indifference to one another; whereas an important function of the human memory lies in its ‘imperfection’ – i.e. the way in which ‘recall’ transforms and distorts the original memory, melds it with others, and in the process the more inessential elements perish while more fundamental ones are preserved.

But it is not just individuals who remember and who forget.  Sometimes it is epochs themselves.  For what else is a dark age, apart from a lapse, a forgoing of memory?  A period of time in which generations of people lose the ability to labour in certain specialised ways; forgotten are the talents and techniques of the artisan, the cobbler, the smith; the ability to smelt or to stitch more complex linings and intricate seams is forgone – it fades into the past – and in the midst of such collective amnesia vast swathes of people abandon the cities, hitherto great hubs of commercial activity, falling back, returning to the land once more, retreating into the bygone cycles of hoeing or simple ploughing which generations millennia before had first taken up.

And of all the skills, trades and techniques which grow dim and fade in such an epochal darkness, surely it is the loss of the capacity to write which is the most grievous of all – for that involves, quite literally, the loss of memory; the loss of the ability to preserve and consecrate the events of the past so that they might once again bloom under the light and scrutiny of new eyes.

Around 1100 BC several civilizations of the Eastern Mediterranean slipped into exactly this type of darkness – the Egyptians, the Hittites, the Mycenaeans, to name a few.  No one knows exactly why these civilizations, quivering on the edge of the precipice, finally toppled into the dark.  Some historians and anthropologists have posited a great cataclysmic event – flood or volcano, take your pick – while others point to the great migrations of the ‘sea peoples’, barbarians from the south or from the east whose marauding hordes were too strong for the walls of civilization to withstand.

Still more (and in the current writer’s view this provides the definitive explanation) suggest that the elites of these centuries-old civilizations had grown bloated and decadent, that the Bronze Age economy was capable of feeding only an increasingly narrow section of society, and that the possibilities of trade and commerce on the part of the vast majority became more and more negligible.

But whatever the case, generations of people did indeed forget.  They forgot who they had once been, to the extent that when Hellenic peoples, centuries later, wandered through the abandoned cities of Mycenae and Tiryns, those travellers and nomads – passing underneath the shadows of the great ruins of the looming ancient buildings – were certain that those same structures were not the constructions of their own human ancestors but rather the work of supernatural beings known as cyclopes.

And yet, with forgetfulness comes the dim shadow of submerged memory.   

Around the turn of the first millennium BC, when the Mycenaean world collapsed, as its buildings became beautiful mysteries and its written script became illegible to the scattered peoples who wandered the world in its aftermath, nevertheless a new type of recollection was born.  Writing was lost, but the oral tradition reasserted itself; stories of great warriors and their deeds – men who had long since faded into shadows – stepped out from the mists of time; huddled by the fire against a backdrop of night, the wandering minstrels drew villagers close with their lyrical spells as they sang of the fleet-footed Achilles, the great King Agamemnon, the beautiful Helen and the wily Odysseus.

A past whose empirical historical record had been erased by forgetfulness was in another sense reborn – in the strange, fantastical and mysterious idiom of myth and legend: the evocation of an epoch whose faint and distant outlines were no longer perceived by historians but could now be glimpsed by poets.

Sometimes the act of forgetting stirs memories anew.  Sometimes loss stimulates growth.

To return to the ancient world, one might alight upon the example of the dark lords of Hattusa, the brutal overseers of a city on a great hill which presided over the militaristic Hittite empire – an empire which exerted a lethal territorial power and even succeeded in dealing mighty Egypt several defeats in prolonged and pitched battles.  And yet, the tributes which the Hittite kings could command from their vast territories came at a cost.  The resort to war and plunder as a fundamental means to nourish the civilization’s lifeblood meant that the development of labour technology for use on the land – for genuinely productive measures – was increasingly stifled.  More and more, a ruling elite developed which gorged itself on the type of luxuries the spoils of war provided, but which showed less and less concern for the sort of innovation and economic development which would facilitate and raise the skills of farmers, artisans, scholars and even slaves.  The society which developed in the aftermath was top-heavy: a massive, bloated and decadent aristocracy alongside a weak and anaemic civil society which, increasingly deskilled and isolated, began to slip back into older and more primitive forms of producing, gradually forgetting the skills and techniques its civilization had accrued.

But the barbarian populations on the periphery of the empire were not subject to this gentle amnesia in the same way.  If the civilization itself was in the process of fading, the barbarians in the hinterlands were not only able to preserve some of the science and techniques which civilization had evolved but also, on occasion, to improve upon them.  The barbarians had been exposed to the culture and technology of the Hittite civilization – through colonial occupation, for sure – but also by having been drawn into the orbit of empire in their capacity as mercenary soldiers who were paid to fight for the Hittite king.  As the great archaeologist V. Gordon Childe would write, ‘by such employment barbarians received a new lesson in civilization. They were apt to learn at least “civilized” methods of warfare, urban processes of armament manufacturing.’[4]

The barbarians imbibed some of the craft and science which civilization yielded while simultaneously retaining a degree of separation, of independence, from the elite class of kings, princes, aristocrats and their minions, who were tied into a royal bureaucracy whose parasitic wealth had put a brake on its ability to innovate.  The barbarians themselves, therefore, were in a position to forge developments and innovations which the imperial elite, hindered by the cloudy fug of decadence, were totally incapable of seeing through.

And so it was that the barbarians who lived in the region of Anatolia, on the edges of the Hittite centre of power, were the ones to carry through perhaps the most radical and fundamental of innovations: the ability to smelt iron, which would eventually yield a whole new epoch of human history.

But if losing touch with one’s own culture and civilization can provide the impetus for others to further develop those same elements, history also furnishes us with examples where the details of past generations, having long since grown faint in the historical memory, are reworked and reimagined by those in the present for new purposes.  In the 17th century John Locke would argue that memory is the key to personal identity, but one should add that it also serves as the key to the identity of whole nations.

And yet, here too, memory can prove to be a slippery and errant guide:  what is it we choose to remember and how closely does that correspond to the past as it actually was?   Every English schoolboy or schoolgirl knows of Richard the Lionheart, a key emblem of English nationalism, a symbol of the ‘courageous’ way in which the English crusaders shaped their identity and their freedoms in the context of battling the Muslim ‘hordes’ in the Holy Land.

And yet, though it is true that Richard came to rule over a region labelled ‘England’, it had very little in common with the nation which exists today.  For one thing, its borders were far more permeable – the realm also comprised much of what is modern-day France, and its parameters were always shifting with conquest or the marriage of royal houses.

But perhaps more importantly, there was no single presiding language: the aristocracy tended to speak French, the clergy Latin, and the peasant majority Old English.  Richard himself was probably raised in the English isles, but it is unlikely he spoke more than a handful of words of Old or Middle English.  His first, and likely only, language would have been an early form of French.  He spent the best part of a decade warring abroad in the crusades, and passed perhaps only around six months of that time back in the English isles.

He was, for this reason, more a glorified feudal warlord, looking to expand what by and large resembled a personal fiefdom through plunder and land seizure, than the leader of a nation with a set of independent interests and cohesive boundaries.  As the late Victorian scholar William Stubbs would argue, ‘He was a bad king: his great exploits, his military skill, his splendour and extravagance, his poetical tastes, his adventurous spirit, do not serve to cloak his entire want of sympathy, or even consideration, for his people. He was no Englishman … His ambition was that of a mere warrior’.[5]

And yet, our historical memories have, in the main, memorialized a different type of figure: the Richard I of Marochetti’s 1856 statue, the Lionheart on horseback, his sword raised aloft, positioned outside the Palace of Westminster like a sentinel, ready to ward off those historical enemies which might try to breach the seat of Britain’s power.  In World War I, when the British captured Jerusalem, little patriotic postcards were issued featuring the image of Richard gazing down from the heavens, with the caption: ‘At last, my dream has come true’.  Here the imperial aims and ambitions of a modern nation state are distilled by marshalling a figure from the past and resurrecting him, making him speak to the present in a fantasy-patriotic guise.  But such a fantasy figure had little in common with his genuine historical counterpart.

Perhaps such mythmaking occurs most of all in the gaps which forgetfulness creates.  In the Roman Republic of 300 BC, one assumes the citizens had little memory of the first villagers who fashioned a few mud-daubed huts by the river Tiber, thereby founding the eternal city many centuries before.  So, into this void, they poured their own mythology: the story of two brothers of aristocratic origin who, having been abandoned at birth and raised by a she-wolf, came to found the city of Rome.  But just as with the myths which surround modern nation building, the story of Romulus and Remus tells us more about the nature of the present than the past.

The idea of two warring brothers speaks most eloquently to the Roman Republic of the third century BC, when the two orders were constantly at each other’s throats; when plebeians and patricians battled one another for control of the Senate and, over the centuries, this sometimes spilled over into murder – consider the ill-fated Gracchi brothers, who both served as Tribunes of the plebs but were killed for endeavouring to effect land distributions to the poor.  The fraternal ire which spills over into murderous violence at the apex of the Romulus and Remus myth is situated in the distant and forgotten past, but it speaks most profoundly to the violent political dualism which had developed out of the conflicting classes of men in the Roman order of the present.

Sometimes, however, the issue is not about remembrance but about forgetting.  In order to facilitate the history of the colonizer, the history of the colonized must be erased.  One recalls the way in which the Catholic conquistadores of the sixteenth and seventeenth centuries looted the art of the indigenous peoples, melting sculptures down for their gold, burning manuscripts, razing temples to the ground in order to construct Christian churches out of the debris, building the edifice of a new culture upon the rubble of the old.  Conquest in some way demanded the systematic erasure of that which had been conquered, the denial of the history of those who had been pressed into servitude or slavery – perhaps because, the hope went, if one erases another’s history, one erases their sense of who they are, of who they were.  And if you can do that, the conquered themselves become a tabula rasa which you can rebrand according to the dictates of your own culture; you might hope to render them passive and supine through the fog of forgetfulness.

Or perhaps it is in the nature of conquest and colonisation: faced with his or her victims, the colonizer is confronted by their own lack of humanity; to erase the victim’s identity, their history, their humanness – is this not simultaneously the endeavour to cleanse the stain of occupation?  To occlude the brutal and parasitical nature of the occupiers, if only from themselves?  Sometimes this is stated in the most literal of ways.

To take a more contemporary example – the Zionist slogan which underwrote the occupation of Palestine and the creation of Israel at the cost of hundreds of thousands of displaced Palestinians was ‘a land without people for a people without land’.   With a single phrase, the colonized – the indigenous Palestinians who had lived and worked the land for centuries and even millennia – are airbrushed out of history, their identities erased from the mind in much the same way their physical presence had been cleansed from the land.

But the erasure of selfhood is not always carried out by other persons or other groups.  

Sometimes the loss of self is exacted by the remorseless rhythms of time and entropy working against physical matter.  The gradually decaying material of age-old cells cosseted deep within the brain, blood vessels bleeding at the microscopic level, elemental processes which slowly fug the mind in a great black cloud, gradually dissolving each and every memory of who and what you are.  This is what dementia means.  For we are our memories; they are the record of our unfolding, the lasting imprints and images of our dialogue with the world, the forms and shapes on which our personhood is structured.  When we begin to lose them fundamentally, it is then that we lose ourselves.

That is what makes dementia such a tragic illness from the purview of the human being.  Cancer is awful, there is no doubt about that – most often it is painful and relentless and frightening and absolutely horrific to the physical body it swarms over.  But weakened, enfeebled and in pain as they so often are, cancer patients nevertheless tend to retain a sense of self.  What is so tragic about dementia is that the person who is subject to it simply begins to fade away within the physical body, until they are little more than a shadow behind the eyes.

And for the loved ones of the sufferer, the process is more painful and more potent still.  Seeing someone who was once lively, vibrant, cheeky, intellectually curious, belligerent, sometimes cruel, sometimes kind – watching them gradually slip away into the mists before your eyes, witnessing the childlike bafflement which comes from their no longer being able to command their own thoughts.  The way the world suddenly rears up in all its vastness, appearing to them as something infinite and alien, a terrible stranger with whom they can no longer communicate.  The retreat into immediacy, into the comfort of sensation, like a traumatized child – desperate and lonely and isolated, clinging to a raggedy and moth-eaten blanket.

And perhaps most of all, those moments when one glimpses something of the former self.  They cut like glass.  When the person manages a joke, and in their tentative smile and soft eyes you see something of the person who once was – their humour, be it ironic or coarse.  Or a fumbled memory from a childhood so long ago, and the stuttering determination to seize hold of it, to elucidate it, to recover a sense of ‘self’ if only for a few moments.

That is where those with dementia live, always on the edges, always on the precipice between being and non-being, personhood and oblivion. 

And yet, there are moments of light in the unfolding dark.  In Making an Exit, the author Elinor Fuchs gives an account of her life with her mother, a dementia sufferer.  What is particularly interesting about this relationship is that when Fuchs was growing up, she was not especially close to her mother.  Her mother was absent much of the time, pursuing her career, leaving her daughter in the charge of her grandparents.  The mother figure is portrayed as a charismatic and beautiful woman, strong and ambitious, creative and lively, and yet remote from her daughter’s life.  When she develops Alzheimer’s in the twilight of her life, however, the relationship between them is revolutionised.

The increasing dependence of mother on daughter enables a new sense of warmth, intimacy and love to bloom.  Fuchs comes to learn about her mother’s playfulness and her fears from within the condition she is suffering: ‘The more reduced you are, the more loving you are, everything else washed away – success, money, glamour, clothes, “things” … It hardly matters now which of us is the mother and which the daughter. Taking care of as good as being taken care of.  My job, to keep the little flame alive for just a while, to keep the little spirit in the world.’[6]

There is, therefore, an incredibly moving reversal in the relations between mother and daughter, whereby the daughter becomes a repository of safety and security and the mother enters into a period of gentle, childlike innocence: ‘I call her “Little Bones” and sit her on my lap. “I love you,” she purrs.  I talk comforting baby talk.  “Little pussycat, my little pussycat.”’  The memoir ends on a note of gentle melancholy: ‘the last ten years: they were our best.’[7]

I don’t know if it is possible to reach any clear-cut conclusion from Fuchs’ incredibly moving account.  The loss of personhood which is the slow and inevitable terminus for all dementia sufferers is a kind of living death, and it is difficult to see in that anything worth such suffering.  And yet, as Fuchs’ memoir illustrates, through the gaps of disintegration moments of great humanity peek through; one’s relationship to the sufferer can be sharpened in its focus while at the same time being softened in its compassion.  In another memoir centred on a dementia sufferer – this time the husband of the author Alix Kates Shulman – the author reflects on her loved one’s loss of memory in the following terms:

That he forgets the concert or dance the minute it’s over hardly matters; even in undamaged brains sensual pleasure fades quickly.  Like the flavour of an exquisite dish or the quality of a given orgasm, the sound of a particular performance, however transporting, won’t usually outlive the week.  Whereas once I kept whole libraries in my head, nowadays I can only recall vaguely the contents of a book I read a month ago, but that doesn’t diminish my passion for reading in the slightest.[8]

Here we see that, as frightening as it is, dementia isn’t a condition which is alien to our humanity; rather it simply heightens what is already an integral part of all human experience – that is, the continual and constant process of shedding one’s memories, of forgetting.    The poet Elizabeth Bishop captured the paradoxical nature of this in a poem which is both light-hearted and mordant, playful and tragic:

The art of losing isn’t hard to master;
so many things seem filled with the intent
to be lost that their loss is no disaster.

Lose something every day. Accept the fluster
of lost door keys, the hour badly spent.
The art of losing isn’t hard to master.

Then practice losing farther, losing faster:
places, and names, and where it was you meant
to travel. None of these will bring disaster.

I lost my mother’s watch. And look! my last, or
next-to-last, of three loved houses went.
The art of losing isn’t hard to master.

I lost two cities, lovely ones. And, vaster,
some realms I owned, two rivers, a continent.
I miss them, but it wasn’t a disaster.

—Even losing you (the joking voice, a gesture
I love) I shan’t have lied. It’s evident
the art of losing’s not too hard to master
though it may look like (Write it!) like disaster.[9]

The poem was written apropos of the poet’s lover who committed suicide – which clearly imbues it with a tragic tinge – and yet at the same time there is a gentle and melancholy acceptance; that is to say, we all must lose, we all must forget; this is the inevitable price life exacts, the toll one must pay in order to enter into existence in the first place.

Another moment of poetry which encapsulates the same paradox – the same sweet, strange sadness – is the end scene of the film Blade Runner.  Here the Harrison Ford character is chasing down the rogue replicant Roy Batty, but he misses a jump and is close to falling to his death, whereupon the fugitive replicant saves him, raising him up to safety.  The moments in which the dumbstruck bounty hunter recovers are also the moments in which the replicant is dying, his body designed to live only a short period of time.

In those final seconds, as the rain cascades over the dark outline of the dystopian cityscape, the replicant utters his last words.  Even though he is a synthetic creation fashioned in a laboratory, he appears in that moment as the most human of all the characters, as he sums up what the life now ending has meant:

I’ve seen things you people wouldn’t believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain. Time to die.[10]

Notes.

[1] Arthur Conan Doyle, A Study in Scarlet, Project Gutenberg, 12 July 2008: https://www.gutenberg.org/files/244/244-h/244-h.htm#link2HCH0002

[2] Joshua Foer, ‘Kaavya Syndrome’, Slate, 27 April 2006: https://slate.com/technology/2006/04/no-one-has-a-photographic-memory.html

[3] Chris Weller, ‘The Photographic Memory Hoax: Science Has Never Proven It’s Real, So Why Do We Keeping Acting Like It Is?’, Medical Daily, 6 June 2014: https://www.medicaldaily.com/photographic-memory-hoax-science-has-never-proven-its-real-so-why-do-we-keeping-acting-it-286984

[4] V. Gordon Childe, What Happened in History (Aakar Books, Delhi: 2019) p. 184.

[5] William Stubbs, The Constitutional History of England (Hard Press, Miami: 2017) pp. 550–551.

[6] Elinor Fuchs, Making an Exit: A Mother-Daughter Drama with Machine Tools, Alzheimer’s, and Laughter (Thorndike Press, Maine: 2005) p. 283.

[7] Ibid., pp. 283–85.

[8] Alix Kates Shulman, To Love What Is: A Marriage Transformed (Farrar, Straus & Giroux, New York: 2008) p. 170.

[9] Elizabeth Bishop, ‘One Art’ Poetry Foundation: https://www.poetryfoundation.org/poems/47536/one-art

[10] Hampton Fancher and David Peoples, Blade Runner screenplay, Trussel: https://www.trussel.com/bladerun.htm#TOP

 

Tony McKenna’s journalism has been featured by Al Jazeera, Salon, The Huffington Post, ABC Australia, New Internationalist, The Progressive, New Statesman and New Humanist. His books include Art, Literature and Culture from a Marxist Perspective (Macmillan), The Dictator, the Revolution, the Machine: A Political Account of Joseph Stalin (Sussex Academic Press), Toward Forever: Radical Reflections on History and Art  (Zero Books), The War Against Marxism: Reification and Revolution (Bloomsbury) and The Face of the Waters (Vulpine). He can be reached on twitter at @MckennaTony