Also by all the other human beings placing limits on what any individual, however smart, can do (e.g. taking over the world). A superhuman AI is going to come into a world full of existing intelligences, some of them machines that may already be close to superintelligent, along with billions of humans and all their organizations.
I think the existing intelligences and organizations won't matter; the superintelligence will be a step change, like the difference between a human and an ant. If its goals aren't aligned up front, it'll be a problem.
To put it into perspective: if you can run the human architecture on an AGI at a billion operations per second, it's like compressing a historical human civilization's worth of learning into a couple of hours.
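A rough back-of-the-envelope sketch of that compression, assuming illustrative figures I'm supplying (a ~100 Hz biological firing rate and ~5,000 years of recorded civilization; neither number is from the comment above):

```python
# Back-of-the-envelope: how much "civilizational time" a sped-up
# human-architecture mind could churn through. All figures are
# illustrative assumptions, not established facts.

BIO_RATE_HZ = 100            # assumed biological neuron firing rate (~100 Hz)
AGI_RATE_HZ = 1_000_000_000  # the "billion operations per second" figure

speedup = AGI_RATE_HZ / BIO_RATE_HZ  # ~1e7x faster than biology

CIVILIZATION_YEARS = 5_000           # assumed span of recorded civilization
HOURS_PER_YEAR = 365.25 * 24

civilization_hours = CIVILIZATION_YEARS * HOURS_PER_YEAR
wall_clock_hours = civilization_hours / speedup

print(f"speedup: {speedup:,.0f}x")
print(f"wall-clock time: {wall_clock_hours:.1f} hours")
# -> roughly 4-5 hours, i.e. the "couple of hours" ballpark
```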
Maybe for some reason this learning process will be slow, or maybe there's some reason it can't scale up quickly, but based on the learning speed we've already seen in narrow domains, that seems unlikely.
There's also the fact that natural selection, a relatively brute-force mechanism of mutation and sexual selection, still resulted in general problem-solving brains being everywhere - general intelligence doesn't seem like something particularly rare.
> run the human architecture on an AGI at a billion operations per second
Maybe a single human, but can you run an entire world civilization and its environment (basically the Earth) in a sped-up manner? One super-fast human-level intelligence is still constrained in a way that all of civilization is not. What is one human sped up a billion times? Is that a billion humans? We already have several of those units operating around the clock, and those units have access to the world's resources, whereas one sped-up mind will not, unless it can gain control of the world.
I said human architecture - so a human-like ability to solve general problems, sped up, freed of the other human constraints (like needing to eat), and focused on a single problem. It isn't like a billion independent humans doing random things - maybe if they could all focus and coordinate on a single goal, but the communication overhead alone would still make it different, on top of the other biological constraints.
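A toy way to see why coordination doesn't scale freely: the number of pairwise communication channels grows quadratically with group size, so a billion coordinating humans is nothing like one mind running faster. A minimal sketch (the group sizes are arbitrary illustrations):

```python
# Pairwise communication channels in a fully connected group of n agents:
# n * (n - 1) / 2. This quadratic growth is why large groups shard into
# hierarchies and lose the tight integration a single mind has.

def pairwise_channels(n: int) -> int:
    return n * (n - 1) // 2

for n in (10, 1_000, 1_000_000, 1_000_000_000):
    print(f"{n:>13,} agents -> {pairwise_channels(n):,} potential channels")
# A billion agents would need ~5e17 channels for full pairwise contact,
# which is why real organizations pay heavy coordination overhead instead.
```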
You may not need that much access to the world to learn/infer a lot about it [1], and if a superintelligent AI needed more access in order to achieve its goal, it would probably be able to get it.