This has a "Bitcoin will solve finance" feel to it. Not to be super dismissive, but it's just as likely to help as not.
But in your corner:
It's true that right now space-based broadband will serve locations where broadband isn't currently available - perhaps spurring competition.
It's also probably true that space-based broadband will be able to compete directly in urban / dense areas where Comcast & company have an effective monopoly - also a good thing!
But there's no reason to think it will put Comcast & company out of business at all. Has cell-tower-based internet service displaced Comcast at all? Verizon and T-Mobile canvass my neighborhood claiming to offer the best internet, but as far as I can tell, everyone still uses Xfinity, even if CenturyLink is much, much better.
There's no phone company that made all other phone companies irrelevant.
There's no tv channel that made all other tv channels irrelevant.
And there's no radio station that .. you get it.
And so it's hard for me to conclude that there's some ISP that will make all other ISPs irrelevant. Honestly, most people will probably end up with about a 50/50 split between home wifi and phone internet. And that home wifi might see significant intrusion from space-based ISPs, but not completely. (I realize SpaceX is also hoping to provide some phone service from satellites, but I digress.)
TV certainly didn't make radio irrelevant; radio is very relevant in situations where you can't look at a screen, or would prefer not to. It's also much lower bandwidth, takes up a lot less space, uses a lot less power on the receiver (even compared to, say, a modern mobile device), and audio-only content is much cheaper to produce.
I've heard it argued that podcasts make radio obsolete. That is more plausible.
Believe it or not, Farnsworth (inventor of the principles making television possible) had no intention of it being used as an advertising or propaganda tool when he set out to solve the problem. That's very much a "what people decided to do with it" thing. Bored farmer figures out a thing, industry does horrible things with it: a tale as old as time.
It was supposed to be for recording and playing back images. That was pretty much it. His original aspiration was for it to be employed in educational contexts, to help make the spread of visual information possible.
Now if you're talking TV Broadcasting as an industry, then you're absolutely correct. That was the business model settled on, but not without some foundational hmmm'ing, haww'ing and experimentation in coming up with business models in the radio broadcasting space. The history is in itself an interesting piece of work to dive into.
Genuinely wondering, why are so many people on HN allergic to any mention of decentralized solutions? It was precisely the opposite in 2014.
Now, anything that smells of BFT consensus, open decentralized protocols, smart contracts, distributed computing, gets lumped with “Web3” and downvoted to oblivion. I was surprised Freenet got through the other day without being associated with it, despite having WASM smart contracts and all of it.
Shouldn’t “hackers” welcome building new, disruptive things, especially if they are open source and disrupt entrenched centralized solutions of the rent-seeking “establishment” cartels? I feel like an old grandpa on HN today, still embracing the “old” “Hacker Ethos” that was being promoted by YC when HN was in its early days.
Has the hacker ethos today really shifted to unironically supporting closed, centralized solutions, and attacking most disruptive technology by deriding it, downvoting people who speak about it in any terms other than dismissive, and trying to make sure it doesn’t take off?
It feels a bit like the story of the political left and liberalism — once upon a time the goal of liberals was to make race a non-issue, for instance, but now that same attitude is considered racist by many on the left. Once hackers were anti-establishment in an “information wants to be free” way. To liberate “systems” from “the man” and bust them open. I remember it. It was still the case a mere 10-15 years ago. On today’s “Hacker” News, if you’d look across a large swath of reactions to projects which aim to do just that, you’d never know it…
Oh! Yes, “Communities getting together and making their own broadband” does sound like a decentralized approach and I agree that it's worth doing and certainly shouldn't be outlawed. I didn't see allergic reactions to that - but maybe I didn't look enough yet.
This sub-thread seems to be about using Starlink, which I don't see as any sort of "decentralized" given it is literally global and in effect controlled by one billionaire, hence my question.
another decentralizedish community solution in nyc is https://nycmesh.net/ which I used for years and volunteered on an install. biggest issue with nyc and why I can't use it anymore is you need line of sight to a hub node.
Is that a reasonable thing to ask, or are we supposed to simply nod along and say “yes, your dogma is correct in 100% of cases”?
For example, tell us how these are a grift: FileCoin. UniSwap. Aave. DID and the Sidetree protocol (used by Bluesky).
And decentralized systems go way past blockchains. Email. Heck, even the Web itself. If anything I’d say they tend to centralize because of the “grift”, which is capturing open protocols and doing rent extraction (e.g. GMail) or surveillance capitalism.
I dunno, blockchains have been around for over a decade or more, and the stuff you listed might be cool. I don't know, because blockchain itself is such a toxic brand that I automatically disregard anyone and anything related to it.
It's like there's a swimming pool full of sewage, and sure there might be a new PS5 Pro floating in there, but I'm sure as hell not wading in to find out.
I'm extremely jealous of this techno-optimist perspective. In my view, at no point has there been a decrease in monopolisation or an increase in democratisation since the hopeful cybernetic '90s, so I just don't see this as good. It's more like some Tyrell or state-owned, corporate, almost-fascist dystopia where everything is owned and censored by the few, and the fact that it's "awesome and space based" only adds to the exclusivity. It's already leaked that Musk is working for the US military, so it isn't some benevolent, utilitarian, or even broad-scaled project; it's a project for world domination and enslavement.
If that happens, wouldn't it be potentially worse than the current situation?
Surely your local neighborhood isn't going to deploy its own dedicated satellite constellation.
If/when satellite broadband becomes widespread, it will likely be consolidated and provided by one provider (maybe two?), which might end up resulting in higher costs for everyone all around.
It is very hard for me to imagine space-based internet ever being able to be competitive financially with an earth-based option outside of special circumstances.
Running an LEO satellite constellation is very expensive. But installing and maintaining a wire to everyone's house is also very expensive.
IMHO, it's a question of capacity and acceptable service levels.
Starlink monthly pricing is higher than my DSL and nearby cable as well as my local muni fiber. But it's not so much higher as to rule it out. And the install fee is a lot lower than muni fiber. It's going to take a long time to get ROI on my installation cost for the muni fiber if I compare monthly cost to Starlink. Muni fiber service level should be much better, but Starlink service would probably have been good enough.
Fair enough; I didn't read your comment very closely, I guess. Terrestrial radio seems a lot less expensive in areas with population. When you get out beyond rural and more into wilderness, satellite is likely cost competitive, because having a LEO constellation cover the wilderness comes at no additional cost, and having radio coverage in the wilderness involves setting up base stations with low usage --- you can do some things with longer range / larger cells, but it depends on terrain and maybe protocol limits? GSM had a concept of 'timing advance' that effectively limited the maximum cell size, but I don't know if that's a limitation if you're not using TDMA.
Security in outside plant isn't free either. We're a long way away from widespread outages because someone thought they could get high with the proceeds of a stolen satellite, but there are lots of disruptions when people take down pole mounted fiber because they thought it would be worth something.
They could be treated as an infrastructure provider in this case, like with landline phone networks, and required to lease capacity to competing service providers. If that happens, there will be pressure from those providers to lower the price.
Or maybe, if we’re lucky, they’ll have to compete on quality as well as price.
When Verizon laid down fiber optic cable in my (former) neighborhood, it was so much better than Xfinity’s service, and everyone I knew switched.
Not that the type of physical cable matters as much (DOCSIS 4.0 is in the cards), but latency and bandwidth will always be better on a (good) wired connection.
One quirk worth mentioning, though, is that WWAN did leap-frog WLAN with 5G and especially with ultra-wideband (which isn’t everywhere). Until I installed 6GHz Wi-Fi, the fastest wireless speeds I saw were on ultra-wideband connections to my phone.
> WWAN did leap-frog WLAN with 5G and especially with ultra-wideband
> Until I installed 6GHz Wi-Fi
So your WWAN was faster than your WLAN only until you bothered upgrading your WLAN.
That's a personal deployment decision, not necessarily a matter of what was or wasn't available at the time. >1G WiFi existed years before ultra-wideband was a thing.
The first phone to ship with ultra-wideband was the iPhone 11 in 2019, and the first phone to ship with 6GHz was the Galaxy S21 Ultra in 2021. The Intel AX210 was the first wireless card to ship with 6GHz at the end of 2020 and the first laptop to include it from MSI shipped in 2021.
Calling WiFi 6 >1G is a pretty far stretch. I don't think I've seen a real world test that breaks that barrier in favorable conditions, even with today's routers:
Whereas ultra-wideband was capable of pulling 2Gbps in 2020, something I have not even seen Wi-Fi 7 able to do. Ultra-wideband still has quite an edge:
> I've also tested 160 MHz channels, which quickly run into the ~930 Mbps TCP throughput limit of a gigabit Ethernet connection, but perform worse at range. I'll cover 160 MHz channels and 2.5 Gbps Ethernet in more depth when the U6-Enterprise leaves early access
So yeah, when the author limits the tests to <1Gb none of the results were >1Gb. Who would have expected those results.
I've had >1G WiFi 5 networks in operation since 2018. It didn't require 6GHz.
Your equipment or your environment may not have enabled it, but it was around.
It might come even sooner as mobile internet becomes faster and gets lower in latency. I used to have Cox, and the service availability was so bad it was nigh unusable. Switched to TMobile home 5g and in some cases, it's actually faster. That shouldn't be a thing but here we are. I don't play twitchy games so latency isn't really a concern for me. By the time 6g or whatever it's going to be called is well established, I think we'll see the incumbents get blockbuster'd.
It takes 5 minutes of back of the envelope calculations to realize that this is a pipe dream. Starlink isn't a substitute for fiber internet, it's a substitute for no internet.
The missing piece is "buckets of Internet" where you pay a provider to have a connection, and they deal with whether it comes over 5G or fiber or Starlink or whatever. Nobody cares about HOW the connection is delivered, just the speed, latency, and reliability.
No. Satellite (except when they use lasers, but nobody is proposing space lasers) is a broadcast medium and wire is point to point. Wires will almost always be cheaper and more reliable and faster on a per-customer basis and have less interference.
Starlink has insufficient capacity for metropolitan areas. I don't think this is easily solvable, given the shared medium and the bandwidth constraints.
Let's try some back of the envelope calculation for providing broadband to the US via LEO satellites.
Initially assume everyone is spread out uniformly, and that we need a broadband connection for every three people, and that 100 Mbps counts as broadband.
If all connections were in use at the same time downloading at 100 Mbps, we'd need an aggregate bandwidth of 11,000 Tbps. Actual usage would be more bursty, so let's assume we can oversell by a factor of 100. That reduces the aggregate bandwidth needed to 110 Tbps.
Wikipedia tells me the satellites with the most bandwidth currently are those in the ViaSat-3 constellation of geosynchronous satellites, each with a bandwidth of 1 Tbps. Let's assume we can get LEO satellites with that bandwidth.
We would need 110 to get the required aggregate bandwidth. However, since each satellite only spends roughly 1/8 of its time over the US we need to multiply the number of satellites by 8.
That brings us to 880 satellites, but remember, we were assuming a uniformly spread-out population. We need to take into account the non-uniformity of the actual population.
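A quick sanity check of the arithmetic so far, using the same assumptions as above (~330M people, one 100 Mbps connection per 3 people, 100x oversell, 1 Tbps per satellite, each satellite over the US 1/8 of the time):

```python
import math

# Assumptions taken directly from the estimate above
population = 330e6              # rough US population
connections = population / 3    # one broadband connection per 3 people
peak_bps = connections * 100e6  # everyone pulling 100 Mbps at once

print(peak_bps / 1e12)          # 11000.0 Tbps aggregate

oversub = 100                                # assumed oversell factor
needed_bps = peak_bps / oversub              # 110 Tbps sustained
sats_over_us = math.ceil(needed_bps / 1e12)  # at 1 Tbps per satellite
constellation = sats_over_us * 8             # over the US only 1/8 of the time

print(sats_over_us, constellation)           # 110 over the US, 880 total
```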
To do that we need to know the ground area that a satellite can serve simultaneously. That depends on the height of the satellite.
Google tells me Starlink's satellites are around 350 miles up, so let's assume that's what we use too. A naive distance-to-horizon calculation says a satellite at that height has a horizon roughly 1,700 miles away (slant range; about 1,600 miles along the ground).
I'm not sure whether that means it could actually serve users that far from whatever spot it's over, or whether there would be problems due to the shallow angle the signal would have to travel to the receiver.
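For reference, the naive horizon numbers (a sketch that ignores the atmosphere and any minimum elevation angle the receivers would need):

```python
import math

R = 3959.0   # mean Earth radius, miles
h = 350.0    # assumed satellite altitude, miles

# Straight-line (slant) distance from the satellite to its horizon point
slant = math.sqrt(h * (2 * R + h))
# Distance along the ground from the sub-satellite point to the horizon
ground = R * math.acos(R / (R + h))

print(round(slant), round(ground))  # roughly 1700 and 1600 miles
```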
If each satellite can effectively broadcast to such a large circle then most of the 110 visible at a given time from the US would be visible over a wide enough area that how the population is distributed might not matter much.
This is assuming that the satellites do not interfere with each other. My guess is that there would be interference and they would have to operate more like cellular networks operate, with each satellite having a limited ground area that it broadcasts into using some kind of spot beam.
Feasibility now depends on the size of those spot beams and how accurately they can be aimed. If they can widen the beams to cover large areas when serving a rural area and narrow them when serving a dense area, and if they can aim a spot anywhere in the large circle from which the satellite is visible, then 880 satellites may be enough.
If, on the other hand, their beams are pretty much fixed in size and direction, then we would probably need a lot more satellites. Given the area of the US, for every place to have coverage, the 110 satellites over the US on average would each have to be responsible for a circle around 200 miles in diameter, and could handle the needs of 3 million people.
If you've got more than 3 million people in a 200 mile circle you'll need more satellites over that circle.
Suppose we have an area that needs several satellites. It might be possible to arrange orbits so particular places always have a lot of satellites so that you could cover that area without having to simply make your constellation bigger.
I have no idea if that is possible. If not, you may have to increase your constellation size so that every place has the number of satellites that your highest-density 200-mile-diameter circle needs.
I don't think that there is any 200 mile diameter circle in the US that would need more than 20 satellites, which would mean we'd need 17600 satellites in the constellation.
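The density scaling above, as a sketch (the 20-satellite worst case is this comment's guess, not a measured figure):

```python
visible_sats = 110    # satellites over the US at any moment
population = 330e6

# Uniform case: each fixed 200-mile circle serves this many people
people_per_circle = population / visible_sats
print(people_per_circle)    # 3 million per satellite

# If the densest circle needs 20 satellites' worth of capacity, the whole
# shell must be scaled by that factor (orbits can't favor one spot)
worst_case = 20
constellation = 880 * worst_case
print(constellation)        # 17600
```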
Note that this is quite dependent on the circle size, so it could be way off from reality. Also, I've not considered communications from the ground to the satellites, which might impose additional constraints.
Conclusion: it doesn't seem obviously impossible to provide 100 Mbps broadband via satellite to everyone in the US. A more detailed analysis is needed.
The bigger concern is latency. Even if technical progress increases the bandwidth, latency is bounded by the speed of light and can't be as low as cable.
The low Earth orbit generally means this is not as big an issue as with older satellite technologies. Currently, if you're in an area where your data doesn't need to bounce through one or more other satellites before hitting a ground station, latency is in the tens of milliseconds (20-60 ms).
Comcast is generally in the 20-30 neighborhood, unless you are using their gigabit service, then it is more like 10-15 milliseconds.
So it is higher, but not debilitatingly so (unless your application is very sensitive to latency). I would imagine there is a lot more jitter in Starlink, but that is more a feel than real numbers.
Mainly, on a terrestrial system you wind up going through more hops on the way to the general Internet, whereas with Starlink you generally bounce straight from your dish to the satellite to the base station. So fewer hops, but farther to go. The physics favors the wire, but not as much as it used to.
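For a rough sense of the physical floor, here's a sketch assuming a bent-pipe path straight overhead at ~550 km, which real links never quite achieve:

```python
C = 299_792.458   # speed of light in vacuum, km/s
altitude = 550.0  # typical Starlink shell altitude, km

one_way_ms = altitude / C * 1000   # user -> satellite, best case
# A request/reply traverses the up/down hop twice
# (user -> sat -> ground station, then back)
round_trip_ms = 4 * one_way_ms
print(round(round_trip_ms, 1))     # ~7.3 ms
```

The observed 20-60 ms sits well above that floor because of slant paths, scheduling, and queuing, not raw distance.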
SpaceX apparently doesn't like to give definitive numbers (perhaps there are too many configurable tradeoffs and/or dynamic adjustments to variable environmental conditions for that to make sense), but the downlink throughput of a Starlink V1 satellite is reportedly about 20 Gbps over an area of several square miles. Putting that on a single fiber wavelength is mature enough to be firmly in SOHO/homelab territory these days (e.g. QNAP and Mikrotik both have 25Gbps switches that are about $1000). It looks like commodity CWDM mux/demux boxes can cram 16 of those into a single fiber. It's safe to assume that the big guys like Comcast and CenturyLink/Lumen can get equipment capable of far more than that.
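One way to frame that comparison, using only the figures quoted above (the 20 Gbps per-satellite number is a reported estimate, not an official spec):

```python
sat_downlink_gbps = 20   # reported Starlink V1 downlink per satellite
wavelength_gbps = 25     # one commodity 25G optic per wavelength
cwdm_wavelengths = 16    # channels a commodity CWDM mux can carry

fiber_gbps = wavelength_gbps * cwdm_wavelengths
print(fiber_gbps)                       # 400 Gbps on a single strand
print(fiber_gbps // sat_downlink_gbps)  # ~20 satellites' worth per fiber
```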
AFAIK, latency sensitive games like CoD and Fortnite suffer badly on Starlink. Don't you think that would have a massive impact on adoption in areas where you can get "wired" internet?
We don't even need space-based broadband for this. Terrestrial radio (5G and LTE) will already gradually drink the milkshake of most cable- and fiber-based ISPs.
Maybe if you have an antenna on top of your house? As it is, I regularly lose internet connectivity (2 bars of LTE or 1 bar of 5G) on my phone when I step into many buildings that are within 0.25 km of Highway 80 in a large CA city.
We'd also trade the fragments of resilience we have left against a single point of failure (both organizationally and infrastructurally).
Space-based internet makes total sense for very remote areas or as a bridge technology, as well as a technology to compete with incumbent terrestrial monopolies, but I'd hate to see fiber and terrestrial 5G rollouts stopping entirely in favor of it.