The latest issue of Lapham's Quarterly opens with the words of Dmitry Itskov*, offering a solution to the theme for the quarter: death.
One way for civilization to move to the next level of development is for humanity to develop and pass to a new evolutionary strategy as soon as possible. The foundation of this strategy is constant and intensive development, reaching a higher level for controlling reality, toward new ideas, meanings, and values, and the creation of a fundamentally new model for the existence of society: spiritual, humanistic, ethical, and high tech.

As our own potential is developed and revealed, our individual consciousness will become complex, flourishing, flexible, and playful. Multivaried and paradoxical, it will inevitably begin to come into conflict with its limited, mortal, protein-based carrier--the biological body. Overcoming this conflict will be the main stimulus for a scientific and technical breakthrough. The development of NBICS technologies (nano-bio-info-cogno-synthetic) opens up possibilities for creating self-organizing systems capable of reproducing the functions of life and the mind in nonbiological substrates. Over the coming decades human beings will gain a new, practically immortal carrier of the personality. This is the path of replacing biological evolution with cybernetic evolution. In these transformations lies the essence of a new strategy for the development of society--the strategy of spiritual and bodily evolution, or evolutionary transhumanism.

The note at the end of this excerpt (of which I've given only two paragraphs) observes drily that the question was posed to Itskov, "a multimillionaire and former online-media mogul," by the New York Times: "Are you insane?"
Well, if he is, so is Stephen Hawking:
"I think the brain is like a program in the mind, which is like a computer," Hawking said last week during an appearance at the Cambridge Film Festival, The Telegraph reported. "So it's theoretically possible to copy the brain on to a computer and so provide a form of life after death."One could say that Hawking, like Dawkins, has now abandoned the hard work of science for that sweet, sweet publicity of saying whatever the hell you want and sounding "science-y" about it.
He acknowledged that such a feat lies "beyond our present capabilities," adding that "the conventional afterlife is a fairy tale for people afraid of the dark."
I was going to leave these two statements as a study in contrast, in how we still look to authority to tell us what to think: Itskov is simply a businessman who gets attention because he's rich, but is anybody going to take these remarks from Hawking and ask him the same question the NYT put to Itskov? Why not? Because Hawking is a scientist?
Let's start with "the brain is like a program in the mind"? What the hell does that mean? Isn't the brain the "meat," and "mind" the program? And don't we all understand the mind/brain=a computer is just a metaphor, like saying the human body is a magnificent machine? But more to the point, how the hell is it "theoretically possible" to "copy the brain on to a computer and so provide a form of life after death"? Because dualism is true, and there is a mind/body split? Funny, but that idea originated in the notion of an afterlife (somebody get Mr. Hawking a copy of the Phaedo, will ya?).
The very notion of the brain being a computer and the mind the program is kinda silly, really. It doesn't explain neuroses or psychoses, or simply the range of emotions that makes human thought possible. (Is curiosity an emotion, or a product of rational thought? Without it there is no knowledge; but really, what's rational about curiosity?)
I dunno; the whole concept is just daft. If I speak of the brain as a computer, it isn't because it's true, but because I have a new referent available for the metaphor of thought and cogitation and perception and emotion and memory and all the myriad things that go on in that three pounds of meat between your ears. I was listening to a radio program today about strokes and aphasia, and victims of aphasia reported knowing the word they wanted to use, knowing they wanted to use it, but being unable to do so. Is that a "hardware" problem, in our computer analogy? If it is, doesn't that make the "software" the self, which is aware of this condition about which it can do nothing? As Marilynne Robinson puts it:
By "self-awareness" I do not mean merely consciousness of one's identity, or of the complex flow of thought, perception, memory and desire, important as these are. I mean primarily the self that stands apart from itself, that questions, reconsiders, appraises. I have read that micoroorganisms can equip themselves with genes useful to their survival - that is, genes conferring resistance to antibiotics - by choosing them out of the ambient flux of organic material. This is not a pretty metaphor, but it makes a point. If a supposedly simple entity can by any means negotiate its own enhancement, then an extremely complex entity, largely composed of these lesser entities - that is, a human being - should be assumed to have analogous capabilities. For the purposes of the mind, these might be called conscience or aspiration. We receive their specific forms culturally and historically, as the microorganism, our contemporary, does also when it absorbs the consequences of other germ's encounters with the human pharmacopoeia. Let us say that social pathologies can be associated with traumatic injuries to certain areas of the brain, and that the unimpaired brain has a degree of detachment necessary to report to us when our behavior might be, as they say in the corrections community, inappropriate. Then what grounds can there be for doubting that a sufficient biological account of the brain would yield the complex phenomenon we know and experience as the mind? It is only the pertinacity of the mind/body dichotomy that sustains the notion that a sufficient biological account of the brain would be reductionist in the negative sense. Such thinking is starkly at odds with our awareness of the utter brilliance of the physical body.The very idea that mind should go on without body is rooted in the notion of duality, the idea that mind is (or should be) permanent, while body is corrupt and impermanent, and how unfair is that!? 
If the idea that one is separated from the other in death is silly, what's rational about the idea that one can be separated from the other by technology?
So there I go again, running on longer than I meant to. But seriously: the mindless reductionism of Hawking on this issue is of a piece with his concluding statement, which is supposed to mark him as a "serious thinker" on this topic, not some benighted fool. And yet, having recently re-read Jonathan Edwards' "Sinners in the Hands of an Angry God," the seminal American work on damnation that still echoes in American pulpits and on TV screens (subdued at times, but never discarded, never forgotten), I can't for the life of me imagine why the afterlife would be regarded as "a fairy tale for people afraid of the dark." Most of the people I know who consider an afterlife (or even reincarnation) a valid concept are more afraid of the afterlife than of the dark.
Which has always pretty much been Richard Dawkins' point. Maybe he and Hawking need to get together and settle this issue by sharing their ignorance about it. They'll find 0+0 still = 0, but I'm not sure they'll understand they're starting with 0.
*not, unfortunately, available on-line