Moore's law, strictly, is about the growth of transistor density. Koomey's law, strictly, is about the improvement in computation per unit energy.
Those are both interesting, but what people frequently care about is something different from either: roughly, "computation per second available in hardware of reasonable size, power consumption and cost". Call this "effective performance".
This can increase even if Moore fails (e.g., we find good ways to exploit parallelism, and build larger devices with more cores). It can fail to increase even if Moore holds (e.g., we can put more cores on a device of the same size, but we aren't good enough at exploiting parallelism so real performance doesn't improve).
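The "not good enough at exploiting parallelism" failure mode can be made quantitative. Amdahl's law (not named above, but the standard model for this) says that if some fraction of the work is inherently serial, piling on cores yields rapidly diminishing returns. A minimal sketch:

```python
def amdahl_speedup(serial_fraction, cores):
    """Maximum speedup on `cores` processors when a fixed fraction
    of the work is inherently serial (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# With just 10% serial work, adding cores quickly stops helping:
for cores in (2, 8, 32, 128):
    print(cores, round(amdahl_speedup(0.10, cores), 2))
# 2 1.82, 8 4.71, 32 7.8, 128 9.34 -- capped at 10x no matter what
```

So more cores per device (Moore holding, in the density sense) need not translate into more effective performance.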
It can increase even if Koomey fails (e.g., we find ways to make our hardware faster; there's a corresponding increase in power consumption but we are still able to cool things well enough so we just accept that). It can fail to increase even if Koomey holds (e.g., we can't make anything faster but we find a way to maintain existing speeds at lower power; very nice but no performance improvement unless power consumption is the current bottleneck).
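The "unless power consumption is the current bottleneck" caveat is worth spelling out: throughput is the minimum of what the clock allows and what the power budget allows. A sketch with made-up illustrative numbers (the budgets and joules-per-op figures are assumptions, not measurements):

```python
def ops_per_second(power_budget_watts, joules_per_op, max_clock_ops):
    """Achievable throughput: limited by the clock or by the power
    budget, whichever binds first."""
    power_limited_ops = power_budget_watts / joules_per_op
    return min(max_clock_ops, power_limited_ops)

# A Koomey-style halving of joules/op at a fixed clock:
# no gain when the clock is the bottleneck...
print(ops_per_second(100, 1e-9, 5e10))   # 5e10
print(ops_per_second(100, 5e-10, 5e10))  # still 5e10
# ...but a 2x gain when power is the bottleneck.
print(ops_per_second(10, 1e-9, 5e10))    # 1e10
print(ops_per_second(10, 5e-10, 5e10))   # 2e10
```

That is the whole point: Koomey-style efficiency gains only become effective-performance gains in power-limited regimes (dense datacenters, mobile devices).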
It used to be that effective performance increased exponentially at a fairly consistent rate. This increase has slowed but not stopped; it's not obvious (to me, anyway) what we should expect it to do in the nearish future.
The consistent exponential increase in effective performance had a name in popular discourse: it was called "Moore's law". Unfortunately, strictly speaking that isn't what Moore was originally describing, so references to "Moore's law" are ambiguous between a law about density and a law about effective performance.
(I unfortunately lack the ability to read minds, so I can't be sure what OP had in mind. But given the statement that "it has big implications for the advancement of AI technology", it looks to me more like effective performance than density.)