Discussion about this post

N0st

Thanks for your post!

“There’s something quite funny or poetic in artificial intelligence that implicitly contains the structure of billions of words of text but only ‘wants’ one thing: to predict the next word of text.”

I know this was just an aside comment, but I disagree with this--I think it's the wrong level of analysis to look for "desire" in the language model (it would be like saying that all a brain "desires" is for depolarization to lead to action potentials, or something like that).

I really appreciated hearing about your teenage existential angst. It got me thinking about how different it was from my own. On the surface, I think I was worried about everything being meaningless or groundless, something classic like that. But it felt more selfish, almost solipsistic, and there was a little Buddhist flair to it. I also came to the conclusion (later) that a lot of it was just loneliness, and me trying to convince myself that this was the feeling of being super smart. It probably also had to do with being a closeted gay teen. Anyways, fun times.

On the topic of intrinsic or fundamental desires, I always think about them from the perspective of attachment theory: people fundamentally want (1) safety and (2) exploration. Everything else on top is kind of a mishmash of conditioned habits, things that are associated with other things, things that symbolize other things (in a personal way), things that are instrumental to other things, etc.

I like the idea that some of this has relevance to AGI, but I feel kind of cynical about the likelihood of that. I feel like people are just going to build something that works, that solves intelligence or whatever, without gaining any greater depth of understanding of the Human Condition. But maybe that's just the existential angst of being the age I am now.

