I mean, if you believe that AGI=ASI (i.e., short timelines/hard takeoff/foom), the transformation will happen regardless of the social system's ability to catch up.
It's not a matter of any social system; it's a matter of hard physical limits. There is no hard takeoff scenario where any AI, no matter how intelligent, could transform the world in any appreciable way in a matter of months: even a superintelligence still needs mines dug, factories built, and goods shipped, and those run on physical timescales that intelligence alone can't compress.