> we examine the contribution of more computing power to better outcomes
No, they pick a set of problems where computational methods are known to have a beneficial impact and then plot every advance in those fields against the growing amount of computing used. Since the amount of computing power used is monotonically increasing, and Elo score / Go performance / weather-prediction success also trend monotonically, the correlation is bound to be high. However, computing power is not the only thing that rose mostly monotonically during that time. At best they derived an upper bound on the contribution of more computing power to better outcomes.
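A toy sketch of that point, with entirely made-up series: any two trends that both grow steadily with time will correlate strongly, whether or not one causes the other.

```python
# Sketch: two hypothetical monotonic trends correlate near-perfectly
# regardless of any causal link. Both series below are invented.
import numpy as np

years = np.arange(2000, 2020)
compute = 2.0 ** (years - 2000)       # hypothetical compute doubling each year
score = 2000 + 30 * (years - 2000)    # hypothetical benchmark score rising steadily

# Correlate log-compute against the score, as a log-axis plot implicitly does.
r = np.corrcoef(np.log10(compute), score)[0, 1]
print(round(r, 3))  # -> 1.0: both are linear in time, so r is maximal by construction
```

The correlation here is 1.0 purely because both series are monotone in time, which is exactly why a high R between compute and outcomes can only bound, not establish, compute's contribution.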
For example, in Mixed Integer Linear Programming (MILP), studies have been done to separate algorithmic from hardware speedup: "On average, we found out that for solving LP/MILP, computer hardware got about 20 times faster, and the algorithms improved by a factor of about nine for LP and around 50 for MILP, which gives a total speed-up of about 180 and 1,000 times, respectively." https://arxiv.org/abs/2206.09787 The paper's methodology would attribute the entire 1,000x effect to the increase in FLOPs alone.
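The arithmetic from the quoted abstract, spelled out: the total speed-up factors into a hardware part and an algorithmic part, and a FLOP-only attribution collapses both into the hardware column.

```python
# Decomposition of the speed-ups quoted from arXiv:2206.09787.
hardware = 20      # hardware got ~20x faster
algo_lp = 9        # LP algorithms improved ~9x
algo_milp = 50     # MILP algorithms improved ~50x

total_lp = hardware * algo_lp        # ~180x total for LP
total_milp = hardware * algo_milp    # ~1000x total for MILP
print(total_lp, total_milp)          # -> 180 1000

# Crediting the full 1000x to FLOPs overstates the hardware
# contribution by the algorithmic factor of 50.
```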
And a methodological concern: taking the logarithm of one axis is a non-linear transformation, so doing a linear fit afterwards gives a distorted measure of the distance between fit and data, in a way that depends on the data. This effect was not discussed. It does skew the R value, so I would not feel comfortable using that R value to derive an attribution.
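To make the distortion concrete, here is a small sketch on synthetic data: a least-squares line fitted to log-transformed values minimizes relative error, so the R² it yields lives in log space, and the same fit can score noticeably worse when the residuals are measured back on the original scale. All numbers below are invented for illustration.

```python
# Sketch: R-squared measured in log space vs. linear space for the same fit.
# Synthetic exponential trend with multiplicative noise.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 10.0 ** (0.3 * x) * rng.lognormal(0.0, 0.4, size=x.size)

def r_squared(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Fit a line to log10(y), as a log-axis plot implicitly does.
slope, intercept = np.polyfit(x, np.log10(y), 1)
y_fit = 10.0 ** (slope * x + intercept)

r2_log = r_squared(np.log10(y), slope * x + intercept)  # fit quality in log space
r2_lin = r_squared(y, y_fit)                            # same fit, linear space
print(round(r2_log, 3), round(r2_lin, 3))
```

On this data the log-space R² comes out clearly higher than the linear-space one, because the transformation down-weights the large absolute residuals at the top of the range.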
To the users of MILP libraries, the difference between algorithmic and hardware improvements is not so significant, and both can be called "computing power" in terms of the number of application-specific solutions per second.
You're right, and your insight illuminates the real gains the world has made on the back of hardware and software improvements, but I think you're one level deeper in the abstraction than the paper intends to be.
Then I must have phrased it badly. The fact that all of the MILP speed-up would be attributed to FLOPs under that methodology means that any improvement in data acquisition, management methodology, or mathematical modeling, the ability to drill deeper, or even the increased regulatory cost of drilling (which, taken together, look roughly linear once log-transformed, for weather prediction or oil exploration after a dubious data selection) would likewise be attributed to computational improvement.