Note that you lose 1-2 orders of magnitude just from screen resolution: in the early nineties we used 640x480, sometimes even 320x200, versus 4K today. Another order of magnitude goes to using a higher-level programming language, though you could probably optimize that down to a factor of 2-3. On top of that, things like memory latency have only scaled sublinearly since the 90s. All in all I agree we should be able to do better, but it is by no means trivial.
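
As a quick back-of-the-envelope check on the resolution factor (assuming "4K" means 3840x2160 UHD), the raw pixel counts work out roughly like this:

    # Pixel-count ratio: early-90s resolutions vs. 4K UHD (3840x2160 assumed)
    import math

    modern = 3840 * 2160
    for w, h in [(640, 480), (320, 200)]:
        ratio = modern / (w * h)
        print(f"{w}x{h}: 4K has {ratio:.0f}x the pixels "
              f"(~{math.log10(ratio):.1f} orders of magnitude)")
    # 640x480: 4K has 27x the pixels  (~1.4 orders of magnitude)
    # 320x200: 4K has 130x the pixels (~2.1 orders of magnitude)

So the "1-2 orders of magnitude" figure holds up just from pushing more pixels, before even counting color depth or refresh rate.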