2 Comments
N0st

Thanks for your post!

“There’s something quite funny or poetic in artificial intelligence that contains implicitly the structure of billions of words of text but only ‘wants’ one thing- to predict the next word of text.”

I know this was just an aside comment, but I disagree with this. I think it's the wrong level of analysis to look for "desire" in the language model (it would be like saying all a brain desires is for depolarization to lead to action potentials, or something like that).

I really appreciated hearing about your teenage existential angst. I liked to think about how different it was from my own teenage existential angst. I think I was on the surface worried about everything being meaningless or groundless or something classic like that. But it felt more selfish/almost solipsistic, and there was a little Buddhist flair to it. I also came to the conclusion (later) that it was just being lonely and trying to convince myself that this was the feeling of being super smart. Probably also had to do with being a closeted gay teen. Anyways, fun times.

About the topic of intrinsic or fundamental desires, I always think about these from the perspective of attachment theory, about people fundamentally wanting (1) safety and (2) exploration. And then everything else on top is kind of a mishmash of conditioned habits, things that are associated with other things, things that symbolize other things (in a personal way), things that are instrumental to other things, etc.

I like the idea that some of this has relevance to AGI, but I feel kind of cynical about the likelihood of that. I feel like they are just going to make something that works, solves intelligence or whatever, without having any greater depth of understanding of the Human Condition. But maybe that's just the existential angst of being the age I am now.

Philosophy bear

Yes, I think this is right, actually. What 'desires' GPT-3 has depends on the level of analysis. For instance, if you ask it to roleplay as a pirate, in one sense it embodies a character that acts like it 'desires' doubloons. What is the right level of analysis: the character, or the implicit author behind the current text? I'm not sure.

I wrote about related topics here: https://philosophybear.substack.com/p/regarding-blake-lemoines-claim-that

On intrinsic desires, I think it's very important that we distinguish the 'original' drives of the creature from its final intrinsic desires. Even if everything was forged in the twin wants of exploration and safety, that doesn't mean those cover everything the creature intrinsically desires.

I suspect you're right that we'll 'solve it' (re: AI) without any fundamental insights into humanity, or if there are fundamental insights, they'll come about as a result of advances in the technology rather than vice versa. Still, a man can dream....
