Thursday, July 27, 2023

AI Is A Tool, Not An Entity

 The threat of AI is a human one, not a technological one.

In books and movies, AI always becomes “human” and conquers humanity (either with kindness, as in “With Folded Hands,” or more commonly through violence). We create our own creation, in other words: a creation in our own image (Frankenstein), and we can’t control it.  We finally become creators ourselves.  AI becomes not a software program but an entity.  Because what is intelligence if not a mark, an element, of being?

Technology and SciFi have made bad phenomenologists of us all.

First, the situation:

The concern about AI in the WGA/SAG strike is not that computers will direct movies. It’s how AI will be used to generate images and put actors out of work.

The fear in campaigning is that AI will easily create “deep fakes” and fool voters.

Do you see the problem? The problem is not technology. AI is not going to empower computers to create bodies or control traffic lights or lob ballistic missiles at countries. It’s not going to generate movies or campaign ads sua sponte. It’s simply another tool. 

AI is the new tool people are going to use against other people, and that possibility scares us. Well, some of us.  AI is a computer program.  It has to be written.  It has to be fed information.  It doesn't seek knowledge the way a young child does, and it doesn't acquire knowledge even as a child does.

Childhood development, by the way, is a magical thing to see.  How does a child learn to use language?  It's one of the great mysteries, but it's so commonplace we don't consider it marvelous at all.  When my daughter was young she created new phrases based on her understanding of words (content) and syntax (word usage and order).  We would tell her to pick something up she'd left on the floor, and if she was already carrying things she'd object: "I'm full of hands."  Which is a perfectly accurate use of English, if a bit dated.  "Awful" started out as "awe full," meaning "awe filled," or: "filled with awe."  We still prefer the latter syntax for the phrase, and use the word to mean something bad, something...awful.  You might well say, to emphasize the point, that you are "full of love" or even "full of joy."  Why can't you exchange either noun for a more concrete one, like "hands"?  And yet we don't, and she figured out the accepted phrase ("My hands are full") without any particular coaching from us.  We still like her original phrase; it's part of the family lexicon.

If you recall "ST:TNG," the character Data was an AI.  But he attended Starfleet Academy to "learn" how to be a Starfleet officer (rather than just download, well...data).  That was AI as a human being, or trying to be a human being (Data's character arc always somewhat followed that of Pinocchio).  He was an entity, which is always the leap of science fiction where "intelligence" is involved.  Intelligence must mean human, or near-human, or, worse, super-human abilities.

And yet my cats have displayed intelligence.  Not the same kind a computer programmer displays, but intelligence nonetheless. I had a cat who could track a ball in my hand with laser precision and catch it in mid-air in his front paws every time I threw it.  I could see the fierce, feline intelligence in that cat's eyes.

Part of this (mis)understanding of what AI must be is the dualism Descartes inherited:  only humans have intelligence, which comes from the soul (via Aristotle and Augustine and Aquinas), and so only humans are truly "entities."  Non-human creatures are just...creatures. Soul-less and, ultimately, mind-less. Which is where our language and concepts betray us.  The "monster" in Frankenstein is not called "the monster," but "the creature."  And it turns out to be just as human as its creator, if not more so.  Indeed, it is the treatment of the creature as a creature, rather than as human, that causes the central conflict of the novel.  And it isn't the creature's intelligence that is his defining characteristic, but his heart:  his emotions, his feelings, his soul (which is how you know it's a product of the Romantic era, not the Enlightenment).  The creature is intelligent, yes; but he is driven by compassion and fear and finally anger.  He displays intelligence, but that is not his defining characteristic.  It is how he is wronged that shapes the novel, and the creature.

We keep meeting the enemy, and the enemy keeps being us. We don’t create our own unstoppable doom. We just keep creating tools, and then fear how we will use them: against us.  Maybe if we paid a bit more attention to being, and a bit less to...well, technology.

ADDING:

There was an entire episode featuring a trial to determine Data's status as an entity with rights, including this line: "Data is a toaster. Have him report to Commander Maddox immediately for experimental refit." AI chatbots are becoming less accurate because they keep scraping everything everywhere, including wrong answers, rather than learning (which is the way I always assumed "real AI" would do it, like CDR Data). My toaster is smarter because it never gives me bad info, but always gives me yummy toast.

Back when computer programming was done with punch cards and screens were called "CRTs" (because they were; basically televisions without the tuner), I learned the computer programmer's answer to the WWII phrase FUBAR:  GIGO. It meant (and still means): "Garbage in, garbage out."  Computer programming was tricky because you always got back exactly what you put in.
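A minimal sketch of the idea, if it helps (the function and the numbers here are invented purely for illustration): the program faithfully computes whatever it's handed, so bad input becomes bad output, without complaint.

# GIGO in miniature: the code is "correct," but it trusts its input completely.
def average_age(records):
    # Mean of the 'age' field; no judgment about whether the ages make sense.
    return sum(r["age"] for r in records) / len(records)

clean = [{"age": 34}, {"age": 29}, {"age": 41}]
garbage = [{"age": 34}, {"age": -7}, {"age": 4100}]   # garbage in...

print(average_age(clean))    # ~34.7 -- sensible in, sensible out
print(average_age(garbage))  # ~1375.7 -- ...garbage out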

Compared to those days, the word processor I learned to write on in the '80s was sheer magic.  And this program I use to post blog posts?  Inconceivable.

And yet, my computer doesn't think, even if my phone offers prompts to finish words I type with my finger (I can't do the thumbs thing on a virtual keyboard.  I just can't.).  And AI that is simply scraping data, all data, and treating it all as equal because...data?  Doesn't that mean it isn't aware of...well, anything?

As I say, my daughter learned to use language pretty much the same way, and she learned it faster and better. She was also self-aware from a very young age (back to theories of childhood development....).

And I still think we should focus a bit more on phenomenology (no, seriously!) and a bit less on the importance of our tools to our self-esteem/self-identity.

On the other hand: toast! 
