Hacker News

A single company running AGI would suggest that something built by humans could control an AGI. That would actually be a great victory compared to the status quo. Then we'd just need to convince the CEO of that company or nationalize it. Right now, nothing built by humans can reliably control even the weak AI that we have.



All this doomering feels to me like it's missing a key piece of reflection: it operates under the assumption that we're not on track to destroy ourselves with or without AGI.

We have proliferated an arsenal capable of wiping out all life on earth.

One of the countries with such an arsenal is currently at war, and the last time powers of this nature were in such a territorial conflict, things went very poorly.

Our institutions have become pathological in their pursuit of power and profit, to the point where the environment, other people, and the truth itself can all go get fucked so long as x gajillionaire can buy a new yacht.

The planet's on a lot more fire than it used to be.

Police (the protect-and-serve kind) now, as a matter of course, own Mine-Resistant Armored Personnel Carriers. This is not likely to cause the apocalypse, but it's not a great indicator that we're okay.

Maybe it's time for us to hand off the reins.


That we're on track to maybe destroy ourselves is not a good reason to destroy ourselves harder.


Not exactly what I meant; there is a nonzero chance that an AGI given authority over humanity would run it better. Granted, a flipped coin would run it better too, but that's kinda the size of it.


Right, and if we explicitly aimed for building a good AGI we could maybe get that chance higher than small.




