Discussion about this post

Alex

So here's one more comment as evidence against the claim that no one reads these posts. Thank you (and I mean this genuinely) for an excellent and thoroughly depressing read. It really did hit like a ton of bricks, but that's a good thing, for all the little-to-no difference it will make to things overall.

You are absolutely right that a lot of the attention, particularly in the rational/LessWrong community, has been focused on the technological path to superhuman AI and alignment issues (I arrived here via a link from one of those posts), and the attention of us readers/lurkers along with it. Which is in a way justified from an x-risk point of view, but it does make us somewhat forget that there will be things coming out of the AI sphere that we have to deal with as a society, things far more likely to go down during our lifetimes, some of which are, as you say, ready to go today.

I wish I had something more poignant or relevant, or at least positive, to add. Maybe I will once I've digested things more. I've been visiting my parents in Romania over the past two weeks, and after reading this article I took a mental walk back through what I've seen and experienced. All I can think is: what chance do we stand of rationally making the right choices and preparations as a society when an overwhelming number of people here are still selling their votes for a bag of cooking oil and a few kilos of flour, and can't really wrestle with any concept more complicated than making ends meet, thanks to the compounding effects of lack of time and lack of education? How do you even begin to explain to people that in five years' time the entire world could be a radically different place because of the speed and nature of technological change, and more to the point, how would you get them to care about that versus what's hurting them today?

Your suggestions in terms of what we can do are very good, and I'm ready to support them wholeheartedly, but I really can't shake the feeling that it will, once again, be too little, too slow, too late.

Ira Allen

Excellent, start to finish: thanks.

Three thoughts:

1. From the perspective of the majority shareholder class, we are *already* in a world where the gross majority (say, 80%?) of the human population is *un*necessary. Serviced by 15% of the population and using only current technology, the top 5% of wealth-holders globally could live functionally almost identical lives to those they presently live--including internecine competition--if the other 80% of us were gone. This is unique in global human history. That bringing such a state of affairs about would "solve" the climate crisis, and that questions of competitive advantage rather than moral tissue stand in the way of collusion toward it, only add to the threat this reality poses to the rest of us--now, and in your 3-5 yr near-term.

2. Your definitions of socialism, both vector and minimalist, also have to include some sense of collective control over what *counts* as social welfare. Else technocratic managerialist liberals--the very people most likely to usher in a durable authoritarianism--could with reason claim to be "socialists."

3. The production of bot-free-from-the-jump social networks is technologically feasible (I nearly registered a few nobotly. domains just now, but decided I couldn't be assed) through a combination of live, real-time-only membership uptake and periodic biometric check-ins (plus more as workarounds evolved, obviously). In the nearish-term future you describe, I suspect many people would be willing to trade (more) biometric data in exchange for the plausible assurance that their digital social network includes only human people as member-entities.

Thanks again for writing--an excellent read.

