Your claims disagree with the facts on the ground: latency is the main factor affecting page load time. It's the driving force behind CDNs, HTTP/2, QUIC, and pretty much every speed optimization people have been working on in the past few years. https://www.afasterweb.com/2015/05/17/the-latency-effect/
Your claim that your page loads faster also reeks of wishful thinking. Pretty much every AMP page I have loaded from a SERP loads instantly, not just fast. For someone on a worse connection, a page served from a nearby cache will have started loading before they even click the link, whereas a page on a far-away origin server won't have started loading at all. In the rare case where the AMP JS is not already in the browser cache, it will be after loading the first result.
As mentioned, I've done testing on actual devices on actual high-latency, low-bandwidth connections, hundreds of times. That's the "facts on the ground".
If you say pretty much every AMP page you've loaded has been instant, please post the specs of the devices and network you've been using for testing.
Additionally, if the latency between the device and the nearest server is over two seconds, the latency to a far server and the click latency don't even come into play anymore; instead the number of connections needed becomes much more important, and bandwidth also becomes a much larger factor.
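To put rough numbers on that (the figures below are assumptions for illustration, not measurements from my tests): every new origin a page pulls from costs its own DNS lookup, TCP handshake and TLS handshake, and third-party origins are only discovered after the HTML arrives.

```python
# Toy model: handshake cost when extra origins are discovered from the HTML.
# Illustrative assumptions: every new origin costs DNS + TCP + TLS = 3 RTTs,
# and third-party origins are only discovered after the HTML arrives.

RTT = 2.0  # round-trip time in seconds on a very poor mobile link (assumed)

def new_origin_setup(rtt):
    return 3 * rtt   # DNS lookup + TCP handshake + TLS 1.3 handshake, 1 RTT each (simplified)

def page_load_latency(extra_origins, rtt=RTT):
    t = new_origin_setup(rtt) + rtt       # connect to the page's origin, fetch the HTML
    if extra_origins:
        # third-party origins open in parallel, but only after the HTML has arrived
        t += new_origin_setup(rtt) + rtt  # connect to them, fetch their resources
    return t

print(f"single origin:                 ~{page_load_latency(0):.0f}s of pure latency")
print(f"HTML plus third-party origins: ~{page_load_latency(4):.0f}s, before any bandwidth cost")
```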
Your claim that HTTP/2 improves latency on poor connections is also false: on bad mobile connections HTTP/2 actually increases latency, because a single lost packet stalls every multiplexed stream on the TCP connection. That head-of-line blocking was a major reason for QUIC, aka HTTP/3, in the first place.
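Here's a toy model of why that happens, nothing like a real protocol simulator; the loss rate, RTT, stream and packet counts are all made-up assumptions for illustration:

```python
# Toy model of head-of-line blocking under packet loss (not a protocol simulator).
# Assumption: a lost packet costs one extra RTT to retransmit. Over TCP the
# retransmit stalls every multiplexed stream; over QUIC only the affected stream waits.
import random

RTT = 0.6                 # assumed round-trip time in seconds
LOSS = 0.05               # assumed packet loss rate on a bad mobile link
STREAMS = 6               # multiplexed resources
PACKETS_PER_STREAM = 20

def worst_stall(shared_stall):
    random.seed(42)       # same loss pattern for both runs
    stalls = [0.0] * STREAMS
    for s in range(STREAMS):
        for _ in range(PACKETS_PER_STREAM):
            if random.random() < LOSS:
                if shared_stall:
                    stalls = [t + RTT for t in stalls]   # TCP: every stream waits
                else:
                    stalls[s] += RTT                     # QUIC: only this stream waits
    return max(stalls)    # the page is done when the slowest stream is done

print(f"HTTP/2 over TCP : slowest stream stalled ~{worst_stall(True):.1f}s by retransmits")
print(f"HTTP/3 over QUIC: slowest stream stalled ~{worst_stall(False):.1f}s by retransmits")
```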
> As mentioned, I've done testing on actual devices on actual high-latency, low-bandwidth connections, hundreds of times.
And as I've mentioned, you've been testing the wrong thing by not understanding the whole point of AMP (safe preloading).
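To make the preloading point concrete, here's a toy timeline; the dwell time and fetch times are numbers I'm making up for illustration, not measurements:

```python
# Toy timeline: how much of the load overlaps with the user's dwell time on the
# results page when the document is preloaded from a nearby cache.
# All numbers below are illustrative assumptions.

dwell_time   = 3.0   # seconds the user spends scanning results before clicking
preload_time = 1.5   # fetching the cached document over an already-warm connection
cold_load    = 8.0   # fetching the same page from a distant origin after the click

# Preloaded: anything that finishes during the dwell time is invisible to the user.
perceived_preloaded = max(0.0, preload_time - dwell_time)
perceived_cold      = cold_load  # nothing even starts until the click

print(f"perceived wait, preloaded: {perceived_preloaded:.1f}s")  # 0.0s, i.e. "instant"
print(f"perceived wait, cold:      {perceived_cold:.1f}s")
```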
> instead the number of connections needed becomes much more important
A page preloaded from an AMP cache needs at most one TCP connection, and usually zero, because the caches speak QUIC, which runs over UDP.
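For reference, here are the handshake round trips before the first request can go out; the RTT counts are the standard TLS 1.3 / QUIC figures, and the two-second RTT is the number you quoted, reused for illustration:

```python
# Round trips before the first HTTP request can be sent (simplified):
#   fresh TCP + TLS 1.3 : TCP handshake (1 RTT) + TLS handshake (1 RTT) = 2 RTTs
#   fresh QUIC          : transport and crypto handshakes combined      = 1 RTT
#   resumed QUIC (0-RTT): the request rides in the very first flight    = 0 RTTs
RTT = 2.0  # the round-trip time you quoted, in seconds

for name, rtts in [("TCP + TLS 1.3", 2), ("QUIC, fresh", 1), ("QUIC, 0-RTT resumed", 0)]:
    print(f"{name:20s} ~{rtts * RTT:.0f}s of handshake latency before the request")
```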
> and bandwidth also becomes a much larger factor.
Which also works in AMP's favor, because the device doesn't need to load your custom JavaScript or potentially unoptimized images, just the tiny HTML and the optimized above-the-fold images. That payload is so small that extra bandwidth buys almost nothing, which is why bandwidth is a relatively unimportant factor.
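Back-of-envelope on the bandwidth side; the page sizes and link speed below are assumptions for illustration:

```python
# Transfer time alone, ignoring latency: size / bandwidth.
# The page sizes and the link speed are illustrative assumptions.

def transfer_seconds(kilobytes, kbps):
    return kilobytes * 8 / kbps   # KB -> kilobits, divided by kilobits per second

link_kbps = 400    # a slow mobile link
amp_doc   = 75     # KB: small HTML plus optimized above-the-fold images
full_page = 1500   # KB: HTML plus custom JavaScript plus unoptimized images

print(f"small cached payload: {transfer_seconds(amp_doc, link_kbps):.1f}s")   # ~1.5s
print(f"full page:            {transfer_seconds(full_page, link_kbps):.1f}s") # ~30.0s
```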
> On bad mobile connections HTTP/2 actually increases latency
You're mixing up packet loss with high latency. Either way it's beside the point, because Google's and Cloudflare's AMP caches both use QUIC; my point was that latency is the key factor that all modern web speed technology has attacked, including AMP.