> A TS is an event that occurs when AI advances to the point where humans can't keep up with understanding and/or predicting its decision-making process and/or the results thereof.
I always thought the AI Singularity moment was when an AI was advanced enough to improve itself, or to write a smarter AI.
It would then lead to a feedback loop that would create incredibly intelligent AIs very quickly (see the toy sketch below).
I don't like your definition at all, as it's already been passed multiple times (like in your Go example).
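To make the feedback-loop idea concrete, here's a toy model in Python. Every number in it (the 1.5x gain per generation, the one-year first build) is invented purely for illustration, not a prediction:

    # Toy model: each generation designs a successor `gain` times smarter,
    # and smarter designers finish their successor proportionally faster.
    intelligence = 1.0   # human-level baseline
    gain = 1.5           # assumed improvement per generation
    months = 0.0
    for generation in range(1, 11):
        months += 12 / intelligence   # assume generation 1 takes a year
        intelligence *= gain
        print(f"gen {generation}: {intelligence:6.1f}x human, at month {months:5.1f}")

The build times form a geometric series (12 + 8 + 5.3 + ...), so the total elapsed time converges to about 36 months while intelligence grows without bound. That finite-time blow-up is the "intelligence explosion" intuition in miniature.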
> It would then lead to a feedback loop that would create incredibly intelligent AIs very quickly
That's exactly what I said. I just limited the domain. Consider, for example, a superintelligent AI constrained to the laptop on your desk. It has no network connectivity, but it can teach itself a billion times faster than any human. Lock it in a room and come back the next day. You observe that the battery died. Did the Singularity (by your definition) occur?
EDIT: doh, it actually should have network connectivity to train itself, or at least some offline source of data.
No, probably not. Because a hyperintelligence would not be constrained by your closet or by your forgetting to plug the laptop in. Especially with a network connection, it should have no problem breaking out.
> Especially with a network connection, it should have no problem breaking out.
You're just assuming that. Why would it have no problem? Because of reasons we can't understand? In that case, "hyperintelligence" is equivalent to saying "omnipotence" in this regard, and as such we can easily dismiss it as an option.
As an example, it could manipulate its hardware to broadcast signals (like driving a monitor to emit on FM frequencies) and entice people to connect it to a network.
With a network connection, this is straightforward. It could hack servers for computing resources. It could earn money doing remote work (say, as a camgirl) and hire meatspace resources as needed.
A nice thought experiment: assume the intelligence has a 128 Kbps Internet connection. What real limitations does that impose?
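For scale, a quick back-of-the-envelope in Python (decimal units, as ISPs measure them):

    # What does a 128 Kbps link actually move?
    bits_per_second = 128_000
    bytes_per_second = bits_per_second / 8          # 16,000 B/s = 16 KB/s
    gb_per_day = bytes_per_second * 86_400 / 1e9    # 86,400 seconds per day
    print(f"{bytes_per_second / 1_000:.0f} KB/s, ~{gb_per_day:.2f} GB/day")
    # -> 16 KB/s, ~1.38 GB/day

That ~1.4 GB/day is ample for text, email, shell sessions, and slow exfiltration, but far too slow for bulk transfers like copying itself anywhere quickly.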