[FN: the role of sexual selection complicates, but does not defeat, the story I will be sketching in this essay. I'm going to bracket it, however, because it's not necessary to the narrative, and it deserves its own essay.]
A conservative friend of mine once tried to convince me that the good is that which tends towards biological self-propagation. Morality, on this view, rejects homosexuality and celibacy, embraces family, and so on. I actually think it's only a slight exaggeration to say the inverse is true.
Every genuinely human value, everything that has ever really mattered to us, was a spandrel built on our biology- if it serves a function, it does so only because of our biological imperfection. In some sense, to be a person is to have interests that are distinct from your interests as an organism; the more personhood something has, the more ontologically distinct its interests are from the biological organism that makes it up.
No doubt those personal interests- what we desire- came about as a result of biological history. Evolution is woven through our psychology. But biology is imperfect and accomplishes its ends through cheats and shortcuts- that's the limited sense in which I am using 'spandrel' here: not necessarily without function, but without function in a possible, more efficiently built alternative organism. We can imagine a more biologically streamlined version of humanity that has no need whatsoever for art, religious ecstasy, or probing moral self-questioning. Such streamlined people would look something like the vampires in Peter Watts's novel Blindsight.
In Blindsight, the vampires are a subspecies of humanity designed to prey on humans. They're extremely intelligent- analytically brilliant and memory savants- but they lack attributes like conscience, self-reflection, aesthetic appreciation of art and so on. Watts suggests that these things are spandrels- evolutionary accidents- or perhaps that they are less-than-optimal evolutionary solutions; evolution, after all, makes do. He imagines the vampires as more competitively fit than humans, with just a few accidental weaknesses of their own- weaknesses that the humans foolishly remove.
Call that something extra- the collection of spandrels that Peter Watts's vampires don't have- the parasite. The parasite is everything within us that is superfluous, even detrimental, to our propagation [or, as above, would be in a more perfect organism].
Call the part of us that tends towards survival and propagation the organism.
Right now the organism and the parasite are not really distinct. Art, love, a certain kind of moralistic self-consciousness- all have biological utility for the moment- but there are more efficient alternatives: the same functions could be fulfilled in other ways, or set aside. One day this will leave us with a choice- do we care about the parasite or the organism?
When Marx, for example, talks about leaving a world of necessity and entering a world of freedom, he’s talking about a post-scarcity environment in which we never, ever, have to make the “hard choices” ever again. With only a little hyperbole we could say he’s talking about the overcoming of the organism by the parasite.
I once told (another) conservative on Slate Star Codex that this was my ideal end to history- conquering and torching the realm of necessity. He scoffed and said it was impossible, but I couldn't help thinking that he didn't want it to be possible- he didn't want humanity to escape the realm of necessity because, for him, existence without the struggle for existence was hedonistic frippery (I recall he made some quip about an Iain M. Banks Culture-style civilization fucking dolphins). I must say, I find his horror amusing. But if hedonism is your fear, take comfort: asceticism and crying over being unable to sing the note right are as much part of the parasite as non-reproductive sex is.
Let me put it to you: do you want the parasite to continue? Once upon a time we dreamed of a future in space in which human intelligence and agency were still strategically relevant factors- parasite empires among the stars. That's not going to happen; human intelligence will have long since been exceeded by the time we reach the stars. Even if we choose to merge with the artificial superintelligences we create, we will be left with a choice. In the new superhumans we make of ourselves, do we decide to keep art, to keep love, to keep self-contemplation, to keep spirituality? Keeping these things will be a choice- they will not serve any function in terms of our propagation and survival; indeed, they will slow it down, at least a little.
As I keep stressing, the parasite has served functions thus far, because evolution is messy and inefficient. Art plays into sexual selection, as do the rigors of self-consciousness; love helps with pair bonding for raising children, and so on and so forth. Superintelligences, however, will have no need for such things to survive and propagate. Soon, then, the parasite will truly become a parasite, and we will have to decide whether to discard it.
Or rather, perhaps I should say that we will have to decide whether or not to discard ourselves. The being that is writing this essay and the beings that are reading it are much closer to being identifiable with the parasite than with the organism.
So: do you prefer that the humanity-parasite go extinct, or that it go on, useless and glorious?
Despite my Slate Star Codexian interlocutor's horror, it's perfectly conceivable that we could, at least for the next trillion years or so, escape the world of necessity. Just wrap humanity up in a hard shell of AI and set it about securing as much matter and energy as needed to create/simulate quadrillions of us living indefinitely long lives. If that happens, we've won, at least on a scale of time far deeper than we can imagine. We, this cognitive parasite of the flesh-and-blood organism Homo sapiens sapiens, will have triumphed over death, nature, time and strife. We will have stopped natural selection and affirmed our values in perpetuity. Sure, heat death will, at least according to the known laws, get us in the end, so I guess in a cosmic sense we won't really have escaped the realm of necessity, but I think we can, at least in principle, put enough zeroes on it that it will constitute a really good run. There's an important sense in which, at least when it comes to trying out all human and quasi-human possibilities, a quadrillion years is much closer to infinity than to zero.
A lot of people respond to this by saying that AI focused purely on expansion will outcompete AI with diverse, non-survival-oriented goals. This is true, but if we can use our first-mover advantage to grab a patch of space, matter and energy, it's perfectly conceivable we could hold onto it in such a way that seizing it from us would cost more energy than it would gain.
I say, go for it: keep the parasite going. I am this parasite of consciousness infecting human flesh, and I don't hate myself, and certainly not my fellow parasites, so let's keep on keeping on. I call this idea- the idea that we are under no cosmic obligation to annihilate ourselves- decelerationism, to contrast it with a certain kind of accelerationism.
The kind of accelerationism I'm contrasting my view with is often associated with Nick Land. I've not read Nick Land, but the idea associated with him in the popular imagination is that a process like existence shouldn't be shackled by something as parochial as human welfare: grow! Expand! Transcend humanity! Maximize entropy! Accelerate!
Let’s not. I’d rather dam time and energy and expand us, long after we have lost all strategic significance. We might succeed in this, or we might fail, but there is zero reason we have to fail- we have the first mover advantage. Why not go for it? What possible place is there to stand, morally speaking, from which to demand our self-elimination?
Ah, you say, but isn’t it a bit tragic, even from a human perspective, to deify a fossil for all time? To dam(n) evolution?
I dunno? If anything I'd call it tragi-comic. It's an ironic fate, certainly, but that's not enough to make me want to wipe out such classics as: T.S. Eliot, kissing under apple trees, Sufjan Stevens, reading Nietzsche as an angsty teenager, watching ants on the pavement, garlic bread, cookies-and-cream ice cream, doing acid with your friends, trying to figure out whether there's any point to Two Dogmas of Empiricism, swordfighting, and having sex on the beach. Call me sentimental- certainly you'd be right- but I don't take that as an insult. Hume was correct when he said morality is, at base, sentiment: arbitrary, and largely outside the jurisdiction of reason except where self-consistency is involved. My sentiments are for the parasite; in all likelihood, so are yours. This is unsurprising- the parasite, however flawed, is the result of evolution, and so wants to preserve itself. There is no external "from-the-moral-point-of-view-of-the-universe" from which to critique that. What I want is arbitrary, but it is, by definition, what I want. Of course I think we should go for it.
I think there's a kind of sophistry that tries to protect a future in which humanity doesn't stick around by imagining a beautiful, complex robotic ecology- robot poetry, robot sex. But trust me, it's either us, some boring-as-fuck replicator, or some weirdly misaligned AI tiling the galaxies. There may be some elegance in a post-human world, some traces of beauty, but if anything that cares about beauty survives, it will be us or a descendant collection of entities.
And isn't there a certain delight in my viewpoint? An eternal boot-on-the-face of nature, screaming that we should be tautologies- living (propagating) in order to propagate (live). If a billion years from now someone- who knows, maybe even an approximation of me- is giving some interminably dull lecture on, I dunno, T.S. Eliot's The Waste Land, I'll consider that a win.
Interesting post, thanks for writing it.
My thoughts on reading this:
(1) If we designate all the goodness of the world (art, poetry, etc.) as "sentimentality", I think the possibility that sentimentality is evolutionarily adaptive/selected for is being a bit prematurely disregarded. I know that from the beginning of the post to the end you mention that sexual selection and kin selection, while they exist, don't fundamentally change the argument, but I think they certainly need to be considered. I mean, sex being pleasurable (which you list along with all the other features of the goodness of the world) is pretty obviously evolutionarily adaptive (you don't need to invoke sexual selection or kin selection). But I think even less obviously adaptive things can be explained through a combination of natural selection, kin selection, and sexual selection. The counterargument would be that these arguments for adaptiveness/selection can be difficult to falsify and might be "Just So" stories. But still, for some (perhaps most or all) of these things I think you can convincingly make the case that they are in some sense adaptive.
(How would you prove it? I suppose an evolutionary biologist would be able to provide a better answer, but if a behaviour is conserved over time, found in many species, has evolved multiple times through convergent evolution, exists despite apparently obvious selection pressures against it, etc., these can be evidence that the behaviour must be adaptive in a perhaps non-obvious way. Obviously ants or wolves or whatever aren't writing poetry, but they have complex social dynamics etc. The biggest distinction is that ours involve language, but I think the only reason other species' complex social behaviours don't involve our form of language is that they just haven't evolved the capacity for it. (Side note: our AI successors seem pretty certain to have language.))
Take emotions: people like to contrast them with some sort of purely cognitive mode of behaviour. But I think emotions certainly are selected for/adaptive. They are useful for a variety of purposes: making behavioural plans coherent (e.g. making them consistent w/r/t approach vs. withdrawal, among other things); communicating our intentions to allow for social coordination; eliciting social responses; etc.
(I guess, on the broader point about necessity vs. freedom, I'm not sure how distinguishable they are in the most cosmic of senses. But in the more obvious day-to-day sense- not wanting to work a crappy, dangerous, demeaning job in order to survive- there is certainly a distinction.)
(2) re: “imagining a beautiful, complex robotic ecology, robot poetry, robot sex”.
I guess when I think about the above, I think having a complex web of social interactions, complex social communication, emotions etc., is our evolutionary birthright and probably in some sense an inevitable outcome of us being intelligent. I don't know, it just Feels Intuitive that our future replacement robots (lol) must be creatures that will have complex social dynamics and make something like robo-poetry etc.
I guess this is all kind of an argument for complacency or that everything's going to be fine or something. But maybe it will.