Every write up I see on the implemented AI systems say they frequently provide false information and act in ways that often mimic severe mental health challenges and I can’t help thinking, is there really a big rush to use this stuff?
— Josh Marshall (@joshtpm) February 16, 2023
3/ Here's one description of the new bing search pic.twitter.com/2l2sXnxpxV
— Josh Marshall (@joshtpm) February 16, 2023
AI chatbot freaks out NYT reporter by trying to destroy his marriage https://t.co/YLfIEIzXnm
— Raw Story (@RawStory) February 16, 2023
"I’m not exaggerating when I say my two-hour conversation with Sydney was the strangest experience I’ve ever had with a piece of technology," he said. "It unsettled me so deeply that I had trouble sleeping afterward. And I no longer believe that the biggest problem with these A.I. models is their propensity for factual errors. Instead, I worry that the technology will learn how to influence human users, sometimes persuading them to act in destructive and harmful ways."
I'm available. As a burnt-out academic and pastor, I'm doubly qualified, and I've got time on my hands! I don't even like people all that much, so that's another qualification. Seriously, they could just pay a burnt-out academic to chat with people.
— Blake Hegerle (@hegerle) February 16, 2023