Tuesday, November 02, 2021

Dial “F” For Fool

Historian and best-selling author displays his deep knowledge of neurology, cognitive science, and computer science by making sci-fi-quality predictions of the future that were old news when James Cameron wrote "Terminator." Anderson Cooper freaks out.

Audience re-evaluates its opinion of Cooper as a serious person.

I mean, Donald Trump is a best-selling author and businessman. That makes him equally qualified to make these predictions. We've had 50+ years of this stuff. Technology has gotten as far as the iPhone 13, and supercomputers still can't forecast the weather much better than the Farmer's Almanac. Yet we will soon program computers to think. Just as soon as we can come up with a concrete definition of what "thinking" is, so we can actually program a computer to do it.

One thing computers still can't do is do what they aren't programmed to do. Some story breathlessly asserts a computer has "learned" something, but insects learn as much and nobody expects them to be our overlords. We just expect them to outlast us.

Remember when a gorilla “learned” ASL? Except it didn’t. Like Clever Hans, Koko could only be understood by her handlers. Other users of ASL said Koko only “spoke” gibberish. The linguistic definition of “language” does not encompass gibberish. Animals can communicate, but they can’t use language. Computers may perform the functions their software allows, but they can’t be said to think. Indeed, we probably have a better definition of what language usage is than we do for what thinking is.

The science fiction version of this still hasn't moved past Clarke's conjecture that enough telephones connected together would mimic the synapses of the human brain and create consciousness. Clarke's assumption was that the nuclear physics concept of "critical mass" applied to networks too: wire up enough nodes and consciousness ignites. Even now your phone is waiting for that last purchaser to switch on so Skynet can arise.

Or not. Models of cognition based on computers are not even models; they are simply metaphors. The idea that once we write a sufficiently complex program it will achieve consciousness is merely a display of our ignorance about what consciousness is. We're no further along than Hume's 18th-century model: that consciousness is an illusion created by sensory impressions received in the brain, which produce a false sense of a self. Of course, who is perceiving these perceptions, and who is perceiving the sense of self, are questions this model can't easily answer. But if cognition is just synapses firing, why was Clarke's "Dial 'F' For Frankenstein" so wrong? Do we need more phones? Or just more programming? Or is the whole subject of consciousness just further beyond our grasp than we care to admit?

Then again, it could be that most of us don't understand how computers work any better than we understand how telephones work. We replace our ignorance with pretense and conjecture, like assuming computers and human brains are really very similar. Trump does the same thing when he talks about servers and algorithms. Well, how many of us understand the internet any better than that, if we're honest? Of course, if we're honest, we'd admit we don't really know how computers or brains work. Instead we credit too much power to the one and show too little respect for the mysteries of the other.

I mean, can we even explain why the iPhone 13 is just the 12 with a third lens? 

Perhaps some things we are not meant to understand.

1 comment:

  1. And here this morning I had to really force myself not to comment on the article about the Vatican symposium on artificial "intelligence." This is catnip for me.

    He's one of them "public intellectuals," so my BS shield was up. I'm allergic to public "intellectuals." And to predictions of the future. I stopped reading 538.
