A professor of mine who worked on transactional memory and speculative execution on Intel processors said the notion of a clock doesn't generally hold any more. He tells a story about showing up and asking for a timing diagram, and they laughed: one hadn't been possible for many years at that point. How long an instruction takes can vary with non-deterministic factors, including speculation and out-of-order execution. My understanding is that cycle counting degenerates into a normalization of time, so that instruction timings stay roughly comparable and stable. But this isn't my area, so I may just be talking out my ass.
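You can see some of that variability for yourself. Here's a minimal sketch (assuming x86-64 with GCC or Clang; RUNS and time_once are just illustrative names) that times an identical loop with RDTSC over and over and prints the spread. Worth noting that the TSC itself is that normalization in action: on modern chips it ticks at a constant rate regardless of what the core clock is actually doing.

```c
/* Time the same tiny workload many times with RDTSC and print the
 * spread of reported "cycle" counts. Assumes x86-64 and GCC/Clang.
 * Caveat: on modern CPUs the TSC is invariant -- it ticks at a fixed
 * rate independent of the core clock, so it's already a normalized
 * notion of time rather than a count of real core cycles. */
#include <stdio.h>
#include <stdint.h>
#include <x86intrin.h>  /* __rdtsc, _mm_lfence */

#define RUNS 1000

static uint64_t time_once(void) {
    volatile uint64_t acc = 0;     /* volatile so the loop isn't optimized away */
    _mm_lfence();                  /* fence so earlier work doesn't drift into the window */
    uint64_t start = __rdtsc();
    _mm_lfence();
    for (int i = 0; i < 10000; i++)
        acc += i;                  /* identical work every run */
    _mm_lfence();
    uint64_t end = __rdtsc();
    _mm_lfence();
    (void)acc;
    return end - start;
}

int main(void) {
    uint64_t min = UINT64_MAX, max = 0;
    double sum = 0;
    for (int r = 0; r < RUNS; r++) {
        uint64_t t = time_once();
        if (t < min) min = t;
        if (t > max) max = t;
        sum += t;
    }
    /* On real hardware min != max: the same instruction sequence takes
     * a different number of reported cycles run to run, thanks to
     * frequency scaling, speculation, cache state, interrupts, etc. */
    printf("min=%llu max=%llu mean=%.1f\n",
           (unsigned long long)min, (unsigned long long)max, sum / RUNS);
    return 0;
}
```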
Given that a constant clock speed practically amounts to a security vulnerability in terms of EM emissions, it's not so surprising that determinism is out the window even at such a low level. Thanks for sharing.