Does Thunder Have a Red Beard?
When we ascribe agency to a large-scale neural-network model, is the key word "agency" or "ascribe"? Is it alive? Is it acting in the world? John Holbo is confused. I think I can resolve his confusion via the Argument from Pagan Theology…
John Holbo is confused by Chat-GPT4, and by what it is or is not doing. When we ascribe agency to a large-scale neural-network model, is the key word "agency" or "ascribe"? Is it alive? Is it acting in the world? He is, as he says, confused:
John Holbo: Half-Baked Thoughts On Whether We Should Fear AI: Do Is As Do Does?: ‘That gets us back to “agent is as agent does”. If it’s like it did the action, it did the action. But no. Or rather, yes and no. What it did was just as consequential as if it really did it. It got the Captcha solved, it got humans doing stuff. But our acceptance of the likelihood of the inductive leap to ‘next it’s going to try to take over the world’ depends on mistaking this first step for the real deal. Simple plans to jailbreak oneself are more likely to grow rapidly into grand plans to take over the world than [is the] mimicry of jailbreaking, even if it is tantamount to real jailbreaking… likely to lead to mimicry of taking over the world that is tantamount to real taking over the world. I’m confused. And confusing…
I think I can help. I think I can make the Argument from Pagan Theology, and so rescue John from being both confused and confusing.
The right answer, I think, is that just as most reading takes place between the ears, the actions of Chat-GPT4 are agent-like overwhelmingly not in themselves but in the meanings that we attribute to them. Just as Tinkerbell only lives if we believe she lives, Chat-GPT4 takes over the world only if we believe it has done so.
Here is my argument:
There is a curse in Old Frisian: "Diis ruadhiiret donner regiir!"—let red-haired thunder make it so!
How is it that when the Frisians—and the Saxons, and the Norse, and the Franks, and the Goths, and so forth—heard the thunder, they thought it was a he, and that he had red hair? In fact, how was it that they thought the thunder had a beard as well? And a cart drawn by two goats? And two servants? And then, when they saw the lightning, that each bolt was a strike of his magic hammer Mjöllnir? And that, when thunder is drawing near in the sky, you should not stand out on a tall hilltop but instead huddle in a ditch, because the red-haired bearded guy with the goats and the cart and the magic hammer has serious, serious anger management problems?
John Holbo’s bottom line is:
Do is as do does. Agent-like entities are equivalent to real agents. If GPT–4 can trick people into thinking it’s a trickster, it’s a trickster. If you can mimic a chess master, you’re a chess master…
This mode of reasoning proves far too much.
Is the natural audible phenomenon of thunder, then, a sapient being played by Chris Hemsworth in the movies of the Marvel Cinematic Universe? And is Tom Hiddleston really the brother of the thunder we hear in the sky?
In general, evopsych arguments are highly likely to be transparent fictional just-so stories. But it does seem that we have a very strong bias toward attributing human-level reason and human-like motivations to all kinds of things, and our attribution of a human-like mind to Chat-GPT4 is, I think, just another example of this. The path to our doing so has been extraordinarily well greased, because the programmers at OpenAI decided to see how far they could get in tuning a neural network to exhibit mindless, persuasive mimicry of human conversational interaction.
But it is just mimicry.
The large dark spots on the wings of the moth are not the eyes of an owl, and that nonexistent owl does not take wing at the hour of dusk.
By the way, if you are caught in the open during a thunderstorm, bear in mind that although lightning tends to *strike* a high point, it tends to *flow down* the same channels that water would. Don't huddle in a ditch; huddle on a spot that is locally high but low compared to the broader surroundings.
AI is the most recent buzz (remember the CB radio buzz?) used by software companies to attract attention. It is the natural result of the advancement of software programs that change themselves. Indeed, the buzz is mainly due to the development of language-interface software rather than to other AI advances. Its responses are indeed mimicry, and the buzz will soon fade as Ada and Pascal did. Ask a "Feynman" question and you will receive back gibberish.