People still haven't completely worked through the fact that Moore's Law has died. I think the anger phase was the late 2000s, when quantum computing, etc., was trotted out to denounce anyone noticing the slowdown.
It seems to me that there is still some progress to be made in Moore's Law. Maybe not directly by shrinking the manufacturing process, but:
AlphaGo requires a huge amount of energy and CPU power to accomplish what the human brain does in 20W, using just a smallish portion of its capabilities. There must be plenty of undiscovered architectural improvements to computers that can still keep Moore's Law going for a couple more years.
And that's exactly what the parent commenter is saying. Moore's law actually wasn't about transistor density in the beginning; it was changed to that definition when people realized transistor density was the only thing still increasing exponentially.
Well, Moore's law is stupid however it is defined, and I do also hope for big benefits from architecture (interface and implementation) work. So I guess I shouldn't be picking an argument :).
[As an aside, I had thought that Moore's law was originally transistor density, and the hype machine spun it into performance. But like I said, Moore's law is doomed regardless, so whatever.]
This looks like bargaining to me.