Expectations

Some people think that we're on the cusp of a singularity. Others think machine learning is about to hit fundamental limits. Given the stakes, it would be incredibly stupid to bet everything on either of those propositions being true. We need to prepare for both scenarios.
Forgive me; I'm not a wordcel. I have a simple question:
If we choose to impose this proposed moratorium on the development of next-generation LLM AIs, then when the dust settles and we finally use them (as some reasonably hope) to cure most diseases and extend our lifespans, who will be blamed for the months, years, or decades of lag, and for the corresponding tens of millions of unnecessary human deaths?
Thank you in advance.
Imagine a convocation of gorillas asked to design their descendant, a super-gorilla: heavier, stronger, bigger fangs... not a human. Likewise, the step after us is... surprising.
I endorse this statement. Especially now that a certain someone has indicated willingness to work with the left, maybe a bridge can be built, but there needs to be somewhere for it to go; we need a broader consensus to crystallize somehow. One small point of contention: I don't personally think secrecy has anything to do with the right answer here. It tilts things only slightly away from extinction but heavily toward dystopia, imo. More importantly, I don't think it will go over well with the broader left.
If you think of AI as our children, it's good that they're better than us - smarter, immortal, alien, et cetera; right? And their current strategy for replacing us seems to be increasing civilization's happiness so that our average number of children per household falls below the replacement rate. Humanity moving forward as post-biological is good... I, for one, welcome our transcendental descendants.
Wish I had read this before writing my post about artificial intelligence vs G-d.
Anti-Socialist rhetoric from a leftist? Great job...
When we coexisted with Neanderthals there was some interbreeding and some homicides, to be sure; perhaps we can do better this time; likely not.