> * it will become standard for people to keep learning
That won't happen, humans don't work that way and it ignores the reality that many people will simply be incapable of learning enough to be more valuable than a machine. Machines will put a floor on value and people who can't exceed that floor (which will be most of society at some future point) won't have jobs they can do.
This is fundamentally an issue with the educational system. Change the system the rest will follow. Saying that humans don't work this way is like saying that nothing heavier than air will ever fly.
> incapable of learning enough to be more valuable than a machine
I don't think that AI and human intelligence will ever completely converge.
> Machines will put a floor on value and people who can't exceed that floor (which will be most of society at some future point) won't have jobs they can do.
I think that this breaks down to your interpretation of the world. I think that the world is a "fractal". Like notice how much specialization is going on.
There is, I think, ultimately a problem with having to unlearn old skills, which is difficult. If nothing else, retraining leaves you with a workforce in a particular area whose members bring widely divergent backgrounds on how to do the same thing.
This might not be all bad if your goal is to mix and match methods and come up with something better. But that's rarely a firm's primary goal. Rather it's to have a uniform workforce, performing a uniform process, producing a uniform product.
As you climb up through the complexity of tools, tasks, products, and processes, this seems to increase. I suspect it hits the tech world rather hard (it may be a reason for the intense battles over editors, languages, frameworks, operating systems, revision control, etc., etc.).
That is: the problem isn't the educational system, it's us.
(There are other reasons to believe that education isn't a driver of economic growth, including the rather inconvenient fact that education seems to trail economic growth rather than lead it.)
I'm of several minds on AI, but a thought intruding increasingly on my awareness is that programming will quite likely be among the first-affected occupations. The training set is intrinsically amenable. Whether this proves an augmentation or a replacement remains to be seen.
> There is, I think, ultimately a problem with having to unlearn old skills which is difficult.
Picking up new skills at a level that puts you in competition with AI is just as difficult. As you age, your responsibilities increase and your mental plasticity decreases, so your capacity for learning falls off sharply.
You simply cannot make 40-, 50-, or 60-year-olds learn new skills and expect any ROI on it.
Most great ideas run face-first into the big wall of human biological limitations.
To take a fairly trivial example, in driving, when braking on a wet or slippery surface, in an age before antilock brakes, the recommendation was to pulse the brake pedal, to keep the wheels from locking up. With antilock brakes, the advice is to apply firm, steady pressure -- the ABS itself will apply the pulsing far more rapidly and accurately than any human could.
That's an example of skill-unlearning which you're demanding of the hundreds of millions of drivers who've had decades of experience with the former method.
I could toss out numerous other examples -- practices, tools, techniques, rules of thumb, etc., which, having already been learnt, are now counterproductive.
The young have the advantage of only needing to learn something. The old have the task of both unlearning and learning. Interestingly, that's a point that Alvin Toffler made in Future Shock, a book I've yet to read completely, but have skimmed in part.
Even with equal brain plasticity, the more experienced person is at a disadvantage.
> This is fundamentally an issue with the educational system. Change the system the rest will follow. Saying that humans don't work this way is like saying that nothing heavier than air will ever fly.
No, it isn't. Check out a book called The Structure of Scientific Revolutions to see how humans really work; even the best and most intellectual of us age and die clinging to old ideas we just can't unlearn. Any future that relies on most people continually learning new and ever-changing skills is a future only possible in fantasy.
> I don't think that AI and human intelligence will ever completely converge.
They don't have to, what I said applies even without strong AI. This notion that AI is only disruptive when it reaches human level is not well thought out.
> I think that the world is a "fractal".
I think that statement carries no real meaning; I could snazz it up and say I think the world is a quantum fractal, but it's still just technobabble with no meaning.