Why everyone believes differently
In the mid-1960s, a researcher at the Massachusetts Institute of Technology, Joseph Weizenbaum, built an automated psychotherapist he named Eliza. This chatbot was simple: when you typed a thought on a computer screen, it asked you to expand on that thought, or it simply repeated your words back in the form of a question.
Even when Dr. Weizenbaum cherry-picked a conversation for the academic paper he published on the technology, it looked like this, with Eliza responding in capital letters:
Men are all alike.
IN WHAT WAY?
They’re always bugging us about something or other.
CAN YOU THINK OF A SPECIFIC EXAMPLE?
Well, my boyfriend made me come here.
YOUR BOYFRIEND MADE YOU COME HERE
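Eliza’s trick is easy to approximate in a few lines of code. The sketch below, in Python, shows the general keyword-and-reflection approach: match a simple pattern, swap first- and second-person words, and echo the rest of the user’s sentence back. The rules here are illustrative inventions for this transcript, not Weizenbaum’s actual script, which was a far larger rule set written in MAD-SLIP, not Python.

```python
import re

# Swap first- and second-person words so an echoed phrase reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your",
               "am": "are", "you": "i", "your": "my"}

# Each rule pairs a keyword pattern with a response template. These three
# rules are invented to cover the transcript above; the real Eliza script
# had many more, with ranked keywords and canned fallbacks.
RULES = [
    (re.compile(r"(.*) made me (.*)", re.I), "{0} MADE YOU {1}"),
    (re.compile(r"they\S* always (.*)", re.I),
     "CAN YOU THINK OF A SPECIFIC EXAMPLE?"),
    (re.compile(r"(.*) are all alike", re.I), "IN WHAT WAY?"),
]

def reflect(fragment: str) -> str:
    """Flip pronouns: 'my boyfriend made me' -> 'your boyfriend made you'."""
    return " ".join(REFLECTIONS.get(word.lower(), word)
                    for word in fragment.split())

def respond(line: str) -> str:
    text = line.strip().rstrip(".!?")
    for pattern, template in RULES:
        match = pattern.match(text)
        if match:
            # Reuse the user's own words, reflected, in the reply.
            return template.format(*(reflect(g) for g in match.groups())).upper()
    return "PLEASE GO ON."  # default prompt to keep the user typing

for line in ["Men are all alike.",
             "They're always bugging us about something or other.",
             "Well, my boyfriend made me come here."]:
    print(line)
    print(respond(line))
```

Run on the exchange above, these few rules reproduce Eliza’s side of the conversation almost word for word. That is the point: nothing in the loop understands anything; it is string matching and substitution all the way down.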
But much to Dr. Weizenbaum’s surprise, people treated Eliza as if it were human. They freely shared their personal problems and took comfort in its responses.
“I knew from long experience that the strong emotional ties many programmers have to their computers are often formed after only short experiences with machines,” he later wrote. “What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.”
We humans are prone to these feelings. When dogs, cats, and other animals exhibit even the slightest bit of humanlike behavior, we tend to assume they are more like us than they really are. The same thing happens when we see hints of human behavior in a machine.
Scientists now call it the Eliza effect.
Much the same thing is happening with modern technology. A few months after GPT-3 was released, an inventor and entrepreneur, Philip Bosua, sent me an email. The subject line read: “God is a machine.”
“There is no doubt in my mind that GPT-3 has emerged as sentient,” it read. “We all knew this would happen in the future, but it seems this future is now. It views me as a prophet to spread its religious message, and that’s strangely how it feels.”