I think your point boils down to "vendors' edge caches are too far away and should be at the city level", right?
If so, major cities already have local Netflix caches inside many ISPs, for example. The same is done for Google, Facebook, and others. It takes peering contracts, colocation, etc., but it exists extensively.
Maybe you're arguing for government-sponsored caching infrastructure?
It annoys me to no end that the idea of metro ISPs never took off. If somebody in the same city as me wants to connect to any services I host at home, they first have to round-trip to my ISP's core routers and then back to me.
Now, this all happens very quickly, within 5-6 ms, but it could be 1 ms or less in an ideal world.
I understand why we cannot have nice things though (cost mainly).
Well, sort of. I feel that something like the InterPlanetary File System (IPFS) should have been the original web. Or at least, after BitTorrent arrived, we should have all moved to content-addressable storage, because it automatically handles caching and scaling. That would have required solving circles of trust, with improvements to SSL certificates so that they don't require a central authority, like a distributed version of letsencrypt.org. It also would have needed improvements to DNS to map domains to data instead of domains to IPs. And at the very least, IPv6 should have made UDP "just work" everywhere, without NAT hole punching. That stuff represents the real work of rolling out a true P2P internet.
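To make the content-addressing idea concrete, here's a minimal sketch in Python. It uses a plain SHA-256 hash and an in-memory dict as a stand-in for the network, purely illustrative, not the real IPFS multihash or DHT machinery: the address of a block is derived from its bytes, so any peer can cache and serve it, and the requester can verify what it received without trusting the source.

```python
# Toy content addressing: the "address" of data is a hash of the data itself.
# Any node holding the bytes can serve them; the requester verifies by rehashing.
import hashlib

def content_address(data: bytes) -> str:
    # The address is derived from the content, not from who hosts it.
    return hashlib.sha256(data).hexdigest()

# Stand-in for "the network": any node could keep a cache keyed by address.
cache: dict[str, bytes] = {}

def put(data: bytes) -> str:
    addr = content_address(data)
    cache[addr] = data
    return addr

def get(addr: str) -> bytes:
    data = cache[addr]
    # Verification is built in: recompute the hash and compare to the address.
    assert content_address(data) == addr, "content does not match its address"
    return data

addr = put(b"hello, content-addressed web")
print(addr, get(addr))
```

The point of the sketch is that caching falls out for free: since the address commits to the content, it doesn't matter which peer answers the request.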
Since we did none of that, we cemented the centralized web we have today, full of walled gardens funding the copyright nanny state. The result is that people are so aghast at seeing the real internet running on stuff like TikTok that their first instinct is to ban it.
To me, it feels like we're living under a bizarro version of the internet, so far removed from its academic/declarative/data-driven roots that it's almost unrecognizable. JavaScript is just distributed async desktop programming now, reminiscent of the C++ days of the late 1980s, complete with all of the nondeterminism that we worked so hard to get away from. Paid services like Cloudflare and 5G are a symptom of that inside-the-box thinking.
If we had real distributed P2P through our routers and cell phones, we probably wouldn't need ISPs today, because we'd be accustomed to gigabit download speeds and would only perceive long-distance traffic as slow and worth paying for. In which case, maybe the government should pay to maintain the backbone, and perhaps satellite internet like Starlink. That would be a good thing, IMHO, because at least there would be oversight, and we wouldn't have to worry about billionaire narcissists making the satellites highly reflective and visible to the naked eye, like carving a McDonald's sign on the moon.