TIMES, TIME, AND HALF A TIME. A HISTORY OF THE NEW MILLENNIUM.

Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawn from Sir Isaac Newton's secret work on the future end of times, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned to two private buyers in 1936 at Sotheby's, but were not available for public research until the 1990s.

Monday, November 8, 2010

The Problem with Memory 2: The Science of Memory

Memory chip. Image Source: Venture Beat.

How do we remember?  What does the brain do, exactly, to create memories?  What are we to make of a report like this one at Live Science, which states that memory is not just a product of brain cells forming connections, wherein nerves reorganize themselves and send messages among themselves to establish a memory, but that individual nerve cells can also hold short-term memories?  There's a piece here from October 25 at Phys.org which further explains how memories are born.  A memory is created when our brain makes groups of its cells "fire in unison"; each memory has a different pattern.  Scientists are trying to find treatments or prevention for Alzheimer's and dementia by administering drugs that stimulate neurotransmitters to older rats.  This research has been headed by Professor Etan Markus at the University of Connecticut.  An earlier report from 2006 on memory creation in the brain is here.
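To make the "fire in unison" idea concrete: a classic toy model of pattern-based memory is the Hopfield network, in which each stored memory is a distinct pattern of firing and silent cells, and recall means settling back into the nearest stored pattern from a noisy cue.  The Python sketch below is purely illustrative, not the model used in the research above; the eight-cell patterns and all names in it are invented for the example:

```python
import numpy as np

# Toy Hopfield network: each stored "memory" is a distinct pattern of
# cells firing (+1) or staying silent (-1). Recall works by letting the
# network settle back into the nearest stored pattern.

def train(patterns):
    """Hebbian learning: cells that fire together are wired together."""
    n = patterns.shape[1]
    weights = np.zeros((n, n))
    for p in patterns:
        weights += np.outer(p, p)
    np.fill_diagonal(weights, 0)  # no self-connections
    return weights / len(patterns)

def recall(weights, cue, steps=10):
    """Start from a noisy cue and update cells until a pattern stabilizes."""
    state = cue.copy()
    for _ in range(steps):
        state = np.sign(weights @ state)
        state[state == 0] = 1  # break ties toward firing
    return state

# Two "memories", each a different firing pattern across eight cells.
memories = np.array([
    [1, 1, 1, 1, -1, -1, -1, -1],
    [1, -1, 1, -1, 1, -1, 1, -1],
])
W = train(memories)

# A corrupted cue (two cells flipped) settles back into the first memory.
cue = np.array([1, 1, -1, 1, -1, -1, 1, -1])
print(recall(W, cue))  # -> [ 1  1  1  1 -1 -1 -1 -1]
```

Even a partly corrupted cue recovers the whole stored pattern, which is one crude way to picture how a fragmentary reminder can retrieve a complete memory.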

Aside from the obvious fears of aging Baby Boomers, why is there pressure to figure out how memory works?  Consider that those who know exactly how neurobiology and neuropsychology overlap to produce our comprehension of time will conceivably be able to control, manufacture and bend memories: in advertising, in cinema, in public life, on the internet, in the military.  Phys.org just came out with a report that scientists have discovered how to erase memory: "Researchers working with mice have discovered that by removing a protein from the region of the brain responsible for recalling fear, they can permanently delete traumatic memories."

Surrounded by Space (2009) © Fredrik Broms. Image Source: The Independent.

Independent caption for above photograph: "Boreal Forest, Kvaløya, Northern Norway, 22 October 2009. Taken from the forest floor the Northern Lights are seen between the stars and through the tree tops."

A. S. Byatt speculated on where the body ends and the mind begins (or vice versa) in the Guardian in 2004 (here) (I typed the passage below from the original newspaper article at the time; there may be discrepancies compared to the linked online version).  Her comment takes her from the physical mystery of how memories are created inside our minds and bodies to parallel conditions in the design of artificial intelligence.  It's a story of physical and electrical chaos (a period of 'pre-time') converging into mysterious patterns; Byatt is clearly fascinated by the 'jump' here from the tangible to the intangible:
James Strachey called it a ‘highly complicated extraordinarily ingenious working model of the mind as a piece of neurological machinery.’  MacIntyre argues that this ghostly neurological theory underpins the curiously spatial model of the Freudian psyche – Superego, Ego and Id. The Id is a swirling mass of normally inaccessible energy, not directly connected (as the ego is) to the external world by perceptions. Freud’s descriptions of it are poetic, and resemble the inferno of the Divine Comedy in some ways. It is a ‘cauldron of seething excitement. We suppose that it is somewhere in direct contact with somatic processes … it has no organisation and no unified will … only the pleasure-principle’. The Id has nothing that can be compared to negation, and ‘we are astonished to find in it an exception to our philosopher’s [Kant’s] assertion that space and time are necessary forms of our mental acts. In it there is nothing corresponding to the idea of time, no recognition of the passage of time, and … no alteration of the mental processes by the passage of time.’

My own writing and thinking have been much influenced by Sir Charles Sherrington’s metaphors for mind and brain. Sherrington, who shared the Nobel Prize for physiology in 1932, was the first to study the synapses, and invented the term. (A synapse is the junction of two nerve-cells.) Most people know his description of the waking brain (the ‘head mass’) as ‘an enchanted loom where millions of flashing shuttles weave a dissolving pattern, always a meaningful pattern though never an abiding one …’ He also described visual perception with a metaphor. ‘The eye sends … into the cell-and-fibre forest of the brain, throughout the waking day continual rhythmic streams of tiny, individually evanescent, electric potentials.’ And he writes about the electric activities of the brain as a system of flashing lights ‘pursuing a mystic and recurrent manoeuvre as if of some incantational dance. They are superintending the beating of the heart and the state of the arteries so that while we sleep the circulation of the blood is what it should be. The great knotted headpiece of the whole sleeping system lies for the most part dark, and quite especially for the roof-brain. Occasionally at places in it lighted points flash or move … At intervals even a gush of sparks wells up and sends a train down the spinal cord…’

I love the combination of metaphors of light, of weaving, and of forests. I love them partly because they recall the primeval landscape of classical myth, with the fates weaving, and the heavens full of lights. I love them more because they are accurate – as to their physical origins – the brain-cells are indeed a forest, and are called dendrites from the Greek for trees, the nervous system is indeed a web of flashing electricity. But I love it too because it gives me a way of imagining my own mind-body (however finite and mortal), that is complex and beautiful. Sherrington was engaging about the incapacity of natural science to say anything at all about the relations of thoughts and the brain – ‘except as a gross correlation in time and space. In some ways this is embarrassing for biology.’

Sherrington’s flashing shuttles preceded the electrical networks of computer science. … For a long time I felt instinctively irritated – sometimes repelled – by scientific friends’ automatic use of the word ‘mechanism’ for automatic bodily processes. A machine was man-made, it was not a sentient being, a man was not a machine. Understanding the flow of electrical and chemical signals – or at least knowing that they were there – changed that to an extent. Before computers there were images of mechanical beings – touching like the Tin Man in the Wizard of Oz, nightmarish like H. G. Wells’s Martians, sloshy organisms in killing machines, like brains in bodies, whose descendant was Lord of the Daleks.

Jean-Pierre Dupuy’s brilliant The Mechanization of the Mind (1994) describes the origins of cognitive science in the meetings of the Cybernetics groups between 1946 and 1953. The scientists involved wanted to construct a science of mind that would describe the relations between mind and matter. They assumed that the mind operated like a machine, and that physical laws explain how nature can appear to have meaning. Dupuy argues persuasively that our sense of cybernetics as a deterministic inhuman system is a travesty. Machines and minds can reveal each other to each other, if we get the algorithms and the metaphors right. Ulric Neisser’s Cognitive Psychology (1967) explores and criticises the analogies between the then existing computer systems and brains, and describes the early ‘neural networks’, remarking that the way forward might be more precise investigation of ‘the wet stuff’.

Since that book there has been an explosion of research precisely into the wet stuff, into the brain itself. The work is done with what we must think of as bodily prostheses that show us what our bodies’ original perceptual systems are not equipped to contemplate. Microscopes, spectroscopes, computers. Richard Gregory said in the 1960s that we now think about worlds we can neither see nor touch. Marshall McLuhan said we live in a social world of prostheses, things added on to the body – telephone, television, cameras – which drastically change our human relations and perceptions of each other, and ourselves.

Philosophers and neuroscientists are constantly redescribing the mind-body problem. Ian Hacking is a philosopher who wrote a book entitled Rewriting the Soul (1995), which analysed those uneasy phenomena, multiple personalities in one body, and false memory syndrome. Hacking is interested, he says, less in whether the diagnoses are right or wrong than in the normative, unquestioned idea that the cause of disruptive behaviour lies in the lost past. It became normative, unremarkable, he says, ‘because memory became the way to have knowledge of the soul’.

Part of the new Millennial sensibility is the huge interest in using technology to cross boundaries.  One might almost say we are 'bewitched' or mesmerized by the powerful potential of our new tools and corresponding abilities.  Suddenly, in every direction, there is potential.  Suddenly, there are no limits.  Suddenly, everything is fair game.

The fog has now cleared around every limitation we previously imagined.  A veil lifted, and we suddenly perceive thousands of new Mount Everests to be scaled, in innumerable fields and areas of human endeavour, and even in daily life and society.  As our tools and machines (our prostheses) improve, they become seemingly objective interpreters.  They feed knowledge of the mysteries of how we work back to us.  Don't know how memories work?  Take pictures of the brain, then the cells in the brain, then the atoms in the cells, while someone remembers something.  Where will this big push take us?  This obsession with going ever further, ever smaller, will compel us until we can see the very building blocks of matter, and even the unseen dark matter building blocks on the other side of our very dimension of perception!  This is, in fact, empiricism and positivism run rampant.  There is a literal-mindedness to all this, to this conviction that we can figure out the mysteries of reality if we can figure out how 'the machine' (our brains, the universe, atoms) works.

In one sense, it's extremely strange that empiricism is running out of control, because for quite some time it looked like the other side was winning.  Positivism, which was the hallmark of nineteenth-century scientific enquiry, the standard bearer of 'scientific progress,' and which originally referred to the rational use of scientific method (rather than, say, neo-Kantianism, phenomenology or hermeneutics as means to understanding), has in the last half century been fiercely attacked by Postmodern theorists, and, before that, by Marxists, Romantics and various other Antipositivists.  This is loosely and often erroneously imagined as a quarrel between various polarities: the quantitative and the qualitative, sense and sensibility, reason and faith, materialism and idealism, the rational and the intuitive, 'bean counters' and 'big thinkers.'

But with the search for Dark Matter well underway in big particle accelerators, we begin to see how the Technological Revolution has changed everything.  Before the Tech Revolution hit full force, up to about 1990 or so, philosophers and intellectuals happily immersed themselves in debates over the relative worth of a posteriori and a priori knowledge.  And no matter what label you slapped on it, be it religious, scientific, sociological or political, be it inside, outside or between these areas of inquiry and debate, we were still singing the same old song.  It was a star-crossed lovers' spat between matter and mind, a troubled romance we told over and over to ourselves.  Again, for the past half century, it looked like Antipositivism was winning.  The illogic of that turn was masked by the fact that much of Antipositivist thought is dressed up in comforting hyper-rationalizations that operate according to their own rules; this finally boiled down to Postmodernist philosophies, the best (for some) of both worlds.

For all their appeals to rationalism, atheism and secularism, much of the power of the Baby Boom generation's revolutionary drive derived from their confident, anti-rationalist adherence to the a priori side of this age-old argument.  This is the source of their boundless idealism.  They are, after all, a generation of labels and labeling, both of themselves and others, and their radical power comes from the created myth, the self-propelled illusion that dominates all perception, even if an alternate reality exists and can be confirmed.  This is the core impulse behind a mentality of mass narcissism, a label which this generation initially happily shouldered as the 'Me Generation.'  Strangely, that negative label has, as of about 2006, been conveniently off-loaded onto anyone younger (see reports here, here and here).  Yet for all the positive-negative label-swapping, the basic resort to labeling and myth-making remains the same.  This way of thinking has informed such concepts as cultural relativism, moral relativism, and social constructionism.  When confronted with data from cameras, technology, or other 'objective' observers, however, the response of these savvy myth-makers is usually cognitive dissonance, followed by confusion.  At first, that was fun.  This has been a favourite theme in David Lynch's Lost Highway, Blair Witch 2, REC, Big Brother, Paranormal Activity and many other films and shows over the past fifteen years.  These dramas toy with the discrepancy between subjective and objective realities.  In these scenarios, any tension generated was merely a frisson of fear that objective reality might be the 'truth' and subjective fantasies fiction.  Viewers could always walk away certain of the prevailing orthodoxy, in place since the end of the Second World War and building for at least one hundred years before that: subjectivity was king and would remain so.

Until now.  Now, these old arguments between realism and idealism are fast becoming meaningless.  What is the point of discussing thought or emotion or memory as a Postmodern or hermeneutic value when it could turn out to have a mathematical value, expressed as a relational equation between matter and Dark Energy, with every subatomic particle accounted for?  In other words, what happens when we develop the technology that lets us count every last bean?  What happens when we discover that changing someone's memories doesn't involve elaborate brainwashing but rather simply requires the physical nano-removal of a protein from someone's brain?

For the time being, two dominant ideologies co-exist.  One is the declining mysticism of evangelical secularism that reached its height from the 1960s through to the mid-1990s.  The other is a nascent system of thought, developed by informatics and tech theorists already drunk with the promise of boundless knowledge and power.  Interestingly, some of the foremost theorists of the Singularity are Boomers, who have transferred their allegiance to become the vanguard of the new way of thinking.  Again, though, I'm not quite sure they've caught the shift of focus.  That is, they are now beholden to a relentless new faith in super tools.  And those tools, by their very nature, demand practice over theory, engineering over art, design over architecture, working application over considered wisdom.

In this new stream of thought, we really do think that we can solve any problem and scientifically deconstruct any mystery.  That's a shift in mentality very different from the seeming rationalism of the 1960s, which was in fact highly idealistic and deeply anti-rationalist in its modes and methods.  After decades of being beaten into submission, positivism popped out of nowhere and made a giant resurgence on the back of technological change, all in the space of the past decade and a half.  It's inspiring a new kind of confidence that reality can be measured, understood and controlled (as opposed to the alternate confidence, based on the idea that reality cannot be understood, but can be masterfully manipulated).

This new confidence is already as troubling as its predecessor.  It implies a fundamental, unknown shift in values.  We are so close to our technology right now, so influenced by it, that we are desperately changing ourselves and our lifestyles to mirror it, rather than the other way around.  Futurists used to think that robots would be made to look like humans.  They never imagined that people would willingly, obsessively conform to the pace of computer capabilities and turn themselves into robots.  In this case, whatever side we take in the old argument between sense and sensibility hardly matters.  The change we are enduring is, on the most fundamental level, quantitative, not qualitative.  Does it matter whether you believe in God or are an ardent atheist, whether you are liberal or conservative, if you spend twelve or more hours per day sitting in front of a computer, watching TV, and using a variety of techno-gadgets?  At the end of that day, neither God nor the rational mind wins in the mind of the beholden user; the machine does.  On the other hand, futurists did imagine nano-circuits so small that machines will no longer be mechanical but physical, mirroring our own neurological pathways.  Our near future lies somewhere between those two potentials.  The Singularity won't be a 'progress' climbing ever higher as we profit from exponential knowledge.  Rather, it will involve a convergence between humanity and non-humanity on a single plateau.

See all my posts on Memory.
