And things are slow as we waste all that processing power on running javascript one way or another. And everything requires a slow blocking connection to the mainframe. Nowadays the “always connected” mindset is really slowing us down.
Ok fair enough, but the mindset is spreading: json (javascript) parsing is what caused GTA Online loading times to balloon and I dread playing Call of Duty online as it wants to download and install dozens of GBs every time I launch it.
It wasn’t json parsing per se, but a buggy roll-your-own implementation that called sscanf (iirc) in a loop over one long string, and sscanf has surprisingly nonlinear cost there because each call rescans the buffer. Fun part is, if they had just outsourced that load to javascript and its JSON.parse, they’d never have hit that quadratic slowdown. Javascript is a nice target to blame, but it isn’t the problem: CPUs got hundreds of times faster, and javascript only divides that by some constant factor N, which has stayed small for decades. Do you really believe that if browsers only supported MSVC++-compiled DLLs without scripting, sites would run faster? That would be naive.
However, there is definitely still less intrinsic optimisation from a dev perspective, I think: people will iterate over the same array multiple times in different places rather than do it once.
I guess our industry has decided moving faster is better than running faster for a lot of stuff.