Discussion about this post

Robert Leigh

Tedious point: the title of this is ambiguous; I initially read "race" as meaning race in the ethnic sense, despite having read the Scott piece. Perhaps amend to "arms race dynamics"?

Otherwise, agree but think you understate the case. "However, it’s far from obvious to me that this guy, a Chinese nationalist from a vastly different political and cultural milieu, doesn’t have deeply different aesthetic and ethical values to me, and these values are different enough to substantially reduce the value of the post-scarcity utopia he might create, relative to mine (from my point of view)."

Two points. One, consider what Lord Acton said: "Power tends to corrupt, and absolute power corrupts absolutely. Great men are almost always bad men, even when they exercise influence and not authority:..." Our world-dominating AI is ex hypothesi going to have power thousands of times more absolute than anyone has yet managed.

Two, cultural and philosophical differences don't seem to matter. Consider the Wannsee Conference of 1942, which decided on the Final Solution to the Jewish Question. Attendees were exclusively from the same developed European, Protestant Christian, post-Enlightenment culture that I come from (I am British) and that the liberal US shares. This did not help, and did not give us a shared bedrock of values about whose application we differ only at the margins. (Incidentally, the fact of it being a conference is perhaps the most chilling thing about it: a corporate decision made within an ostensibly rational framework, not a lone maniac, and one made in a banal context we are all familiar with, presumably with presentations with slides and coffee breaks and paper and pencils set out for everyone.) People are not fundamentally decent guys with a meaty core shared with the rest of humanity - not even people from identical cultural backgrounds. Putin has not yet caused your violent death because you are not in Ukraine - not because he is at heart a lovely chap. And he is not an edge case or outlier whom it is unfair of me to pray in aid. He is just a bad man with power.

Much AI risk thinking is science fiction. For alignment to be on the cards, you need a set-up like 2001. There are, I think, 5 or 6 NASA guys on the ship, on a mission to save humanity or discover the secret of life or whatever, and 3 or 4 of them are asleep. You can expect them to be hugely aligned with each other and therefore all plausible candidates for HAL to be aligned with. Step out of the spaceship and it all falls apart.

Scott Alexander

Thanks, this is a good post. A few thoughts:

1. One point I was trying to make was that post-singularity problems will be weird ones, like hedonic utilitarianism vs. something else, which won't cleave along normal political lines. When people talk about winning a race for the Singularity, they mean that they think Biden would be a better God-Emperor than Xi. But even though I like Biden better in our current situation, I don't know that he's any more qualified to make hedonic-utilitarianism-related choices. Possibly it's better if he respects democracy and we let the American people vote - that probably maintains some level of post-singularity freedom which we can use to opt out of the hedonic utilitarianism, even if that wins.

2. I hope if a conservative won the singularity and banned gender transition forever, they would at least have the decency to cure all gender dysphoria. That seems better than our current world, and neither better nor worse to me than the world where everyone can transition (I realize some trans people may have different preferences). I think there are a lot of things like this where seemingly awful values become fine once you have infinite technology that can eliminate the reasons they were awful in the first place (harm reduction for predation by having animals lose qualia for the last hour of their life, whenever that may be?).

3. Please don't accept a deal where you risk 5% chance of extinction to wrest control of the singularity away from me. Please let me get the singularity, present evidence that you had the option to risk 5% chance of extinction to wrest it away from me but didn't, and I'll give you 5% of the universe (or shift the universe 5% in the direction of your values, or something like that). Possibly I'm thinking about this the wrong way, but I promise I will think about it the right way once I can give myself infinite intelligence and there will still end up being some deal like this which is better than you taking the gamble.

4. Although I don't think greater intelligence will necessarily solve all value conflicts, I think the ability of whoever controls the Singularity to get such high intelligence that they can understand your perspective exactly as well as you can, and see all your reasons for supporting it in just as much detail as they can see their pre-singularity self's reasons for supporting their old opinion, is pretty significant. I think only a sadist or stupid person wouldn't take this opportunity, and I trust entities that have taken it much more than I trust normal pre-singularity entities. I don't know how to balance this against "what if I galaxy-brain myself so hard that I can't maintain normal human reasoning and do something that pre-singularity me would have found abhorrent", but I'll think about this more if it ever comes up.

