Monday, April 11, 2016

"or your shadow at evening rising to meet you...."

This is what happens when you let an engineer talk about biology and philosophy:

But, of course, that makes little sense. Any A.I. will face some limits and constraints, but they will certainly be different than those that have shaped E.I. No A.I., for example, will be limited by the head size that can fit through the birth canal (the so-called obstetrical dilemma where evolution must balance larger craniums against a birth canal limited by bipedal form). And it is also doubtful that an A.I. will need to have the “follow the crowd” reflex so characteristic of humans when they become part of a mob. Perhaps more fundamentally, why should an A.I. need or use emotion the same way an E.I. does, as a convenient way to abbreviate longer decision-making processes (a mental shortcut called an affect heuristic)?
The "obstetrical dilemma"?  Doesn't that presume that larger brain size = greater mental capacity, in some kind of crude one-to-one ratio of function to mass?  Didn't Stephen Jay Gould eviscerate this baseless nonsense decades ago?

Apparently not.

And the easy way emotion is elided from intelligence recalls the old dichotomy (still very much alive, though it shouldn't be, either) that it is our emotions that constrain our reason, and that without them we would be better thinkers (Thanks, Enlightenment!  Please to be going away now....).  This is the same fundamental assumption behind the author's notion of "evolved intelligence," because we all know evolution is about improvement and "progress"; except, of course, it isn't at all.  And don't get me started on the idea that human intelligence is full of "kludges" that prove we don't think as well as we should.  By what standard, pray?  That of economics (his example)?  So if we were better thinkers, we'd be wiser, or at least happier, consumers?  And how did that evolve?

And why should "an A.I. need or use emotion the same way an E.I. does"?  Maybe because that's the way "thinking" works?  Maybe because thinking, as humans recognize it, is not limited to ratiocination?  And to so limit it is to engage the worst nightmares of science fiction writers for the past century?

Again, this has been debated for some time.  Maybe you could bother to join the conversation and listen, instead of speaking first and listening later.
