Hacker News
U.S. FAA clears 45% of commercial plane fleet after 5G deployed (reuters.com)
159 points by neom on Jan 18, 2022 | 184 comments



From the most bare-bones news articles I've read about the topic (and in this country they generally all are), this has already been rolled out in other countries to no great issue? Here it's a turf/jurisdictional spat between the FAA and FCC over not having gotten all the approvals through the desired channels in time for rollout?

Although, I guess papers like this detail the issue more carefully: https://ecfsapi.fcc.gov/file/7021340930.pdf https://www.rtca.org/wp-content/uploads/2020/12/Slides-5G-In...


The FAA has noted that in other countries there were mandates for lower power levels for 5G and that the antennas be pointed down toward the earth so that they interfered less.

https://www.faa.gov/5g

Lower power levels and a downward tilt on the antennas would mean less range for C-Band networks, so carriers would rather not do that - especially in the US, where the population is much more suburban and rural than in Europe.

It's possible that C-Band won't interfere even at higher power levels and even without downward antenna tilts. However, other countries have mandated greater permanent restrictions on C-Band than the US's temporary 6-month measures.


AFAIK it's not the power level, but 5G in the US has a slightly higher spectrum range that's closer to that of the radio altimeters. Finland apparently even uses power levels so high that they'd be illegal in the US, without any problems.

Source: https://www.lightreading.com/5g/why-does-5g-only-pose-a-prob...?


The FAA doesn't cite that concern on their page, but having a 200MHz buffer in the US rather than a 380MHz buffer is a difference.

As noted in the article you posted, Canada is requiring antenna down-tilting nationwide as well as exclusion zones around airports.

EDIT: "an exclusion band is allowed ±10% around the band, within which the HIRF levels are very low—for radar altimeters, this spans 3.78–4.84 GHz"

https://www.rtca.org/wp-content/uploads/2020/12/Slides-5G-In...

So, in Europe, their C-Band ends at 3.8GHz and barely enters the range. In the US, where we're going to 3.98GHz, we're well into that range.

It looks like Korea's band ends at 3.7GHz, which is outside the range (and definitely by 3.8GHz, which is where band N78 ends; the US is using band N77, which ends at 4.2GHz, but we've only licensed spectrum to 3.98GHz).
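
To make the band arithmetic concrete, here's a quick sketch (Python; band edges are the ones quoted in this thread, and note that the ~200/380MHz buffer figures cited upthread seem to measure from a slightly different reference point than the 4.2GHz band edge used here):

    # Radar altimeters operate at 4.2-4.4 GHz; the RTCA slides cite a
    # +/-10% exclusion band around it: 4.2*0.9 to 4.4*1.1 = 3.78-4.84 GHz.
    ALTIMETER_LOW = 4.2  # GHz
    exclusion_low = ALTIMETER_LOW * 0.9  # 3.78 GHz

    # Top of licensed C-Band spectrum per region, as quoted in this thread.
    c_band_top = {"US": 3.98, "Europe": 3.80, "Korea": 3.70}  # GHz

    for region, top in c_band_top.items():
        guard_mhz = (ALTIMETER_LOW - top) * 1000
        status = "inside" if top > exclusion_low else "outside"
        print(f"{region}: {guard_mhz:.0f} MHz below the altimeter band, "
              f"{status} the +/-10% exclusion band")

Which lines up with the comments above: Europe's 3.8GHz top barely crosses into the 3.78GHz exclusion band, Korea's stays outside it, and the US goes 180MHz deeper in.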


200MHz is already crazy large. Super crazy. The original request from Boeing was only 100MHz.

These altimeters are dangerously defective if they need a 400MHz guard band. That is nuts


You're actually looking at the intersection of three problems:

* LTE is very noisy (it has echoes in its sidebands for up to 100MHz)

* Low-frequency LTE is *loud* -- punch-through-mountains loud.

* These altimeters are doing the RF equivalent of looking for a needle in a haystack

While FCC Part 15 says "accept any/produce as little as possible", much of this comes down to the physics of RF.


C-Band in the US is rolling out in 2 phases, the first uses spectrum from 3.7 to 3.8 GHz. The second - in roughly two years - adds the remaining 3.8 to 3.98 GHz. So for the first two years the spectrum used by C-Band 5G in the US is essentially the same as in most of the rest of the world.

https://www.commscope.com/blog/2021/maximizing-c-band-deploy...


I've read it's multiple concerns: https://www.faa.gov/5g

https://www.airlines.org/5g-frequently-asked-questions/

In Europe, the power output seems to be ~2.5 times lower, there's a bigger frequency separation, the 5G towers are placed further away from approaches, and the antennas have to point downwards.


Doesn't this show that some countries don't plan ahead that much, or that different elements of government are too disconnected to work cohesively?

Still, this chatter will no doubt give some loon some ideas on how to bring down a plane or mess around with the flight patterns at busy airports.


Radio spectrum allocation was largely decided back in the 1950s, and as I understand it the US military owns most of it. It's difficult to plan ahead for future technology.


Oh I don't know, I think other countries seem to be making less of a hash of it, compared to the US.

And does it really matter if the US military owns the spectrum? That's like me saying the UK government owns the spectrum; they should still both be subject to the court of law, if the laws are fit for purpose in the first place!


> Doesn't this show that some countries don't plan ahead that much, or that different elements of government are too disconnected to work cohesively?

This can be a feature, not necessarily a bug.

There are plenty of examples of government central planning and cohesion going very wrong.


I thought that AT&T and Verizon offered to do lower-power towers near airports. The story from November seems to say as much.

https://www.theverge.com/2021/11/24/22801008/verizon-att-mid...


All cell phone towers seem to be aimed toward the ground.

I've flown a GA aircraft all around the lower 48, and only get a cell phone signal briefly when flying directly over a cell tower.


Naturally. Why would a cell carrier want to expend energy beaming signal into space?

That said, you do get the occasional misaligned antenna or weird bounce off terrain that gets you signal midflight for a minute.


How easy is it to report those types of issues?


What would you like to report? That cell signal was available somewhere?


Make a phone call, and let the big data folks raise eyebrows when they see the elevation the call was made from.


It's really regional. I get 5 bars pretty much anywhere below 5000ft in the Bay Area.


Yep, I get signal up to about 3000' AGL where I am.


Well, given that the possible range to a cell tower is about 35 km / 20 mi (max, an absolute limitation of how the GSM/timing is done), it's very plausible that even if cell broadcast antennas are generally pointed down + flat with the horizon, you're gonna get some signal from some antenna in the distance.
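
For the curious, that ~35 km figure falls straight out of GSM's timing-advance mechanism. A rough back-of-the-envelope sketch (assuming the standard GSM bit period and the 6-bit timing-advance field):

    # GSM compensates for propagation delay with a 6-bit "timing advance"
    # value (0-63), where each step is one bit period of round-trip delay.
    C = 299_792_458         # speed of light, m/s
    BIT_PERIOD = 48 / 13e6  # GSM bit period, ~3.69 microseconds
    MAX_TA = 63

    # One-way distance per step is half the round-trip distance.
    step_m = C * BIT_PERIOD / 2  # ~554 m per timing-advance step
    max_range_km = MAX_TA * step_m / 1000

    print(f"{step_m:.0f} m per step, max cell range ~{max_range_km:.1f} km")
    # -> ~35 km, the hard limit mentioned above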


The US seems stuck in a holding pattern of "two more weeks, two more weeks" without addressing any of the potential workarounds like power, directional antennas, or even blackout zones. From what I've read, the airlines and FAA haven't really put forward any proposals for compromise, like leaving towers off near airports. It's always no 5G anywhere until they've had more time, as if the planned 5G deployment came as a total surprise. Or why can't the towers be turned off in bad weather only?


My read is either the FAA or the airlines or both want money re-allocated from the FCC spectrum sales or the carriers (or both) to retrofit their dated equipment.


So the airlines are cheap bastards, got it


Yeah, as someone that isn't following this super closely, it feels like the airlines are on their back foot and unprepared. These radios are days away from being turned on permanently and they haven't done any smoke testing? Why weren't they working with the telecoms 12 months ago to run trials?

I'm not sympathetic to the airlines right now.


This is not the airlines' problem: the airlines have type-certified transport/commuter category aircraft with all kinds of radios - including the radar altimeters that are the problem here - that are required to meet exact performance specifications - which they did.

The FAA waited until December to issue airworthiness directives that limit the ability of the airlines to rely on radar altimeters under certain circumstances. The background here appears to be some kind of pissing match between US government agencies including the FAA, the FCC and the NTIA.

What exactly do you think the airlines were supposed to do here? Go out and have all their radar altimeters replaced? Replaced with what? The required standards haven't changed, and it's the FAA that writes the standards (technical standard orders).

This is also not a "smoke testing" problem. This is a "on the worst day, in the worst set of circumstances, what is the degree of compromise of the assumed safety margins" question.


>it feels like the airlines are on their back foot and unprepared.

Maybe the airlines should temporarily reassign the teams that redesign seating arrangements to reduce spacing and allow more people onboard. They seem very efficient at their tasks, and there's really not much left for them to do.


Shouldn't be surprising to hear that the FAA doesn't really do anything with this stuff. They appoint a corporate-filled board of avionics companies to do all of the work for them, and then rubber stamp what those companies want.


It’s not all 5G, just a new band. They can figure out the technical issues without putting airliners at risk.


Maybe key people are out with Covid?


As a reminder, Turkish Airlines flight 1951 crashed in 2009 due to a faulty radio altimeter which set the autothrottle to idle before they touched down. This isn’t hot air or some sort of power struggle between FCC and FAA. Radio altimeter interference is a big deal.


From the report: https://catsr.vse.gmu.edu/SYST460/TA1951_AccidentReport.pdf

"Interference as a singular cause appears to be extremely improbable."

Interference is mentioned a few times in the report as likely not being the cause or at least sole cause. The only entity seeming to push interference as a cause was Boeing, which led to this remark:

"We are not aware that interference has been conclusively identified as the explanation for the radio altimeter anomalies. In fact, some airlines have found that their incidence of radio altimeter anomalies decreased when they chose to install the new gaskets. If the conclusions here are intended to be DSB conclusions, rather than conclusions of Boeing and the airlines, the text should be revised to indicate they are DSB conclusions." in which they struck out the text "The negative values of the radio height were an indication of interference"


> As a reminder, Turkish Airlines flight 1951 crashed in 2009 due to a faulty radio altimeter which set the autothrottle to idle before they touched down.

Was it due to interference? Otherwise this seems specious.


No, it wasn't, and I agree this is a specious example.

The Wikipedia entry [1] has a good overview, as does this Medium post [2]. In the flight 1951 crash there were actually 2 altimeters for redundancy, but due to human error they were configured so that 1 faulty altimeter affected the autothrottle:

> While on final approach for landing, the aircraft was about 2,000 ft (610 m) above ground, when the left-hand (captain's) radio altimeter suddenly changed from 1,950 feet (590 m) to read −8 feet (−2.4 m) height, although the right-hand (co-pilot's) radio altimeter functioned correctly. The voice recording showed that the crew was given an audible warning signal (landing gear warning horn) that indicated that the aircraft's landing gear should be down, as the aircraft was, according to the captain's radio altimeter, flying too low. This happened several times during the approach to Schiphol. The reason that the captain's radio altimeter was causing problems was the first officer making a mistake when arming the aircraft's autopilot system for a dual channel approach.

> The Boeing 737NG type aircraft has two autopilot systems, which can work independently of each other (single channel) or together (dual channel). Turkish Airlines' standard operating procedure at the time stated that all approaches should be flown "dual channel" when available, but the inexperienced first officer forgot to arm approach mode in the aircraft's mode control panel (which controls the autopilot), meaning that the aircraft thought the pilots wanted to do a single-channel approach using control computer A (captain's autopilot), which had a radio altimeter failure.

[1] https://en.wikipedia.org/wiki/Turkish_Airlines_Flight_1951

[2] https://admiralcloudberg.medium.com/test-article-the-crash-o...


Want to plug your second source, admiralcloudberg, as a brilliant blog that is worth reading if you're into airline crashes and system failures. He's very in-depth and detailed in his writing.

https://admiralcloudberg.medium.com/


Thanks to your plug I started reading. Pretty good resource indeed.

I note that the accident report on the Turkish airplane further confirms what we have since learned about Boeing:

* persistent issues with radar altimeters in the entire fleet

* issue classed 'low priority', even though it feeds into the autothrottle

* near-crash results in a 'fix' that disables the autothrottle if two sensors disagree.

* 'fix' is not rolled out to the entire fleet. And it is not 100% effective.

* issue is not mentioned in documentation; pilots cannot see which sensor is used

Is it me, or is this exactly the same history as MCAS? Drives home the famous mantra "if it's a Boeing, I'm not going"


Also watch Mentour Pilot on YouTube. https://www.youtube.com/channel/UCwpHKudUkP5tNgmMdexB3ow

He is very thorough and detailed, an instructor pilot with a lot of experience.


Wow, the use of a single sensor to control a safety-critical feature is reminiscent of the 737 MAX MCAS-related crashes nine years later. (I understand that the pilots should have set the airplane to use two sensors in the above case; in the case of MCAS, I'm not an expert, but it didn't sound like that was an option in the initial version.)


I’m curious how many instances of the gear warning horn going off are about forgetting to extend the landing gear, compared to the plane being about to have a CFIT. Maybe it would be wise to always treat it as a GPWS “pull up” command and only try to troubleshoot at a safe altitude.


We have GPS now, combined with lots of other instruments. I'm not sure radio altimeters are even used on many aircraft anymore.


GPS surely is not used for instrument landings, and especially not for altitude.


>> GPS surely is not used for instrument landings, and especially not for altitude.

It is. And increasingly so in fact, as it allows more airports that don't have the resources to install and maintain ground-based equipment to have previously unavailable approaches. https://www.flyingmag.com/everything-you-need-to-know-about-...

However, it doesn't fall into exactly the same class as ILS approaches.


Instrument rated pilot here. A GPS LPV approach is essentially the same as an ILS [edit for clarification: a category 1 ILS]. It’s true that an LPV approach is classified as non precision but in practice an LPV approach is pretty much indistinguishable from a cat 1 ILS, which is classified as a precision approach.


IIRC GPS LPV is at best considered comparable to ILS Cat1, and GPS altitude data is pretty much forbidden to be used for that (radio altimeter is required for any kind of low-visibility approach, iirc, plus it's required for TAWS).

In good conditions the approaches might be indistinguishable, especially given that GPS approach once on final will be usually a copy of GPS one, but the safety of GPS LPV is considered much lower, for a good reason.


Can't edit anymore, but of course I meant that GPS LPV approach on final will be a copy of ILS :)


Hm, that's interesting to me. Is it just regulators (FAA I guess?) falling behind the times? Or is the GPS version really not adequate for some use case that is only satisfied by ILS (not sure what that is but I assume it's the "gold standard" landing approach?)


GPS notoriously has poor altitude resolution. For this reason, GPS approaches are technically not considered "precision" approaches.

WAAS (GPS plus a correction signal transmitted by separate geostationary satellites) improves the resolution to CAT I standards - an "LPV" approach. However, it's still not sufficient for CAT II/III approaches.


You basically use the regular altimeter for altitude during such approaches. It's accurate enough to get to the decision height of a few hundred feet AGL after which you have to fly the rest of the approach manually anyway. That is true with CAT I ILS as well.

A CAT III ILS approach requires very specialized equipment and training because you basically fly it all the way down to the runway with no requirement for visibility until moments before touchdown.

However, GPS is a lot better when it is not being obstructed by buildings around you. Planes have the benefit of unobstructed paths to GPS satellites and would get better quality and accuracy than you might get with a phone in a city. Also, using good antennas helps. So the altitude accuracy would be fine mostly.


Fine mostly, until it’s not. Then maybe those last couple meters might count.


Fine as in common practice and the way aviation has been operating safely for the past few decades.


To count as a precision approach (which ILS is), the equipment and procedure have to meet certain accuracy requirements (e.g. <x> feet error at <y> distance/height from touchdown) as well as the ability for the equipment itself to alert the user that things are out of tolerance within <z> seconds of being detected faulty.

I.e. the transmitter on the ground must be able to detect that something is wrong and immediately cut out the signal / report it as inop to the receiver on the plane. When you're seconds from hitting the ground, that is a requirement.

Although GPS-based approaches have some aspects of this, I believe that they just don't meet the exact requirements as currently written, although they are close.


Outside of the U.S., there are GPS-based precision approaches. I say GPS-based because they all require SBAS (satellite-based augmentation systems, like WAAS or EGNOS) rather than being pure GPS systems.

ICAO recognizes SBAS Cat I as a precision approach, and there are dozens of them in Europe (France, Austria at least).


There are different categories of ILS approaches (one of them is actually impossible to implement at this time, as it requires tech that is at best in testing), and most importantly GPS is not considered "safe enough". Most use in heavy aviation is due to easier operation (especially with autopilot) but in better conditions only (i.e. at best equivalent of ILS Category 1)


Airports served only by an RNAV (GPS) approach with LPV minimums have different alternate minimums (for planning purposes). That’s the only real distinction. LPV approaches are classified as non-precision due to an ICAO technicality over the lack of a ground transmitter at the airport. GLS approaches compensate for this.


It is not uncommon to get +/- 15m error on the vertical axis. There is a significant delay in the signal too, so it can't be used for landing.


I think you're being downvoted because the imprecision you speak of is a result of consumer GPS receivers having a limited polling rate, limited precision, and only using one GPS frequency. You can't extrapolate the limitations of GPS on, say, a phone to limitations on a commercial plane.

See the following from gps.gov: it can be accurate to within centimeters and with a high polling rate, but most consumer devices don't employ the necessary tricks. https://www.gps.gov/systems/gps/performance/accuracy/


Military gets more time resolution as well as no deliberate clock skew. Airliners won't have military GPS. GPS receivers for surveying require on the order of hours to resolve to "centimeters" using civilian GPS.

The highest resolution GPS receiver I have still has +/-15ns jitter, which over the course of a day is good enough for maybe 12-20cm resolution, but the first hour the track is wild, 50-80 meters off in varying directions.

My GPS is used as a time stamp source for lightning detection and location, so in aggregate with other detectors it can pinpoint a strike to within a meter or so anywhere in my hemisphere.
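
To illustrate why the averaging takes so long, a naive sketch (this assumes independent errors per fix, which real GPS error sources like multipath and atmospheric delay violate; that correlation is why the practical floor stays around the 12-20 cm quoted above rather than what the square-root law alone predicts):

    import math

    C_M_PER_NS = 0.299792458  # light travels ~0.3 m per nanosecond
    jitter_ns = 15            # receiver timing jitter quoted above

    per_fix_m = jitter_ns * C_M_PER_NS  # ~4.5 m of error per raw fix

    # At one fix per second, truly independent errors would shrink as sqrt(N):
    for hours in (1, 24):
        n = hours * 3600
        ideal_cm = per_fix_m / math.sqrt(n) * 100
        print(f"{hours:>2} h of averaging: ~{ideal_cm:.0f} cm (idealized)")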


This is why a recently calibrated altimeter is used for the altitude axis on GPS approaches. They also must have a constrained error margin to be certified, and it is factored into the buffered region around the approach.


> I'm not even sure radio altimeters are even used on many aircraft anymore.

What misled you to think this? Radio altimeters are used throughout the aviation industry. Their ubiquity and criticality are the issue at hand. [0]

[0] https://www.faa.gov/other_visit/aviation_industry/airline_op...


Yes, I’m sure that few meters of vertical inaccuracy will not matter at all during landing.


Radar altimeters are required for reduced minimums on ILS approaches.


Has anyone ever had awesome LTE and said “I need more bandwidth” on their phone? Idk, I just don’t see the need for 5G, but I’m definitely not a phone power user.


It's not just about what exists today, but what we want to be able to do in the future.

First, there are situations where it's hard to get "awesome LTE" because there's a lot of usage. Getting more bandwidth is important to handle situations like that.

Beyond that, we don't know what the future will offer if we build greater capacity. When Apple launched the iPhone, we didn't know that we'd be changing the way we live our lives. Things like TikTok wouldn't exist without the abundant bandwidth offered by LTE. We don't always know what greater bandwidth and capacity will offer.

We are seeing some things already. We're seeing wireless home internet become available to a lot of people. For cell carriers, this is a new stream of revenue. For customers, this can increase home broadband competition and, for many rural customers, offer the first chance at broadband.

We're seeing things like Nvidia's GeForce NOW allow you to play games that are being run on the cloud rather than your local device. Lower latency and greater bandwidth/capacity means a better experience. Nvidia recommends 50Mbps which LTE can do, but won't have as much capacity to support as many simultaneous users. 5G will also drive down latencies which are important for gaming.

It's possible that self-driving technology might take advantage of this in the future. I'm not someone that promotes self-driving tech, but one can see how lower latency and higher bandwidth/capacity could mean being able to send more data to the cloud both for computation and also for storage/learning.

A lot of people are talking about VR and meta-verses. Again, I'm not necessarily buying into this yet, but it seems like greater bandwidth and lower latency could make things better.

Heck, even video-conferencing on mobile can benefit from more bandwidth and lower latency. While LTE can accommodate it, there are times when networks get congested and less congestion is a good thing. Lower latency also offers a better experience.

Do you need it? No. But you don't need a smartphone. At the same time, it's possible that someone will create something that you won't know how you lived without that's made possible by more bandwidth/capacity and lower latency.


Totally agree. Whether consumers should have gotten costly high-end phones in 2020-2021 with degraded battery life due to 5G is a different story, but maybe we had to be sacrificed to build future-proof 5G infra by buying phones with unnecessary 5G.

3G (HSPA) to 4G (LTE) was huge. I happily upgraded.


I think whether it's currently unnecessary might depend on where you are and what network you're on. T-Mobile has mid-band spectrum that isn't C-Band spectrum and they've been able to deploy it sooner. They're seeing 300Mbps+ speeds on a lot of their network already. We're also seeing battery life get better fast. X50 phones were problematic. X55 phones had some degraded battery life. X60 phones seem to be doing pretty well.

I think it's easy to forget that LTE wasn't as big a jump initially as we see today. Back in 2012, T-Mobile's HSPA network scored 5.5Mbps in PC World's testing against 7.4Mbps for Verizon's LTE and 9.1Mbps for AT&T's LTE. In PCMag's 2013 testing, T-Mobile's HSPA got a similar 7.7Mbps while the 4 LTE networks got 5.1Mbps, 10.3Mbps, 13.1Mbps, and 16.0Mbps. Again, LTE networks were seeing marginal improvements. Over time as network deployments matured and new LTE releases came out, LTE's improvements went way beyond HSPA.

But the first round of LTE phones on Verizon were terrible - way worse than any 5G device we saw. Some people wanted to be a first adopter. I think waiting a generation or two gives you better results, but some people always want to be first.

I actually think the reason the US pushed LTE so quickly is simply because Verizon didn't have an upgrade path for their 3G network which was stuck in the 1Mbps range while HSPA-based competitors were pushing into the 4-8Mbps range. In Europe, carriers could delay a year and offer dual-channel HSPA service in the meantime.

LTE was still better even in the early days, but the fact that CDMA didn't have an upgrade path put Verizon in a tough place. They could either watch AT&T offer 5x faster speeds than their HSPA network (at a time when AT&T had an exclusive on the iPhone) or they could be an LTE early adopter. I think AT&T would have liked to milk their HSPA investment a bit longer (as many carriers did around the world), but once Verizon started marketing LTE they had to follow.

And you're saying that the upgrade was huge - and it certainly has been. However, it was more marginal at the start. That's not to say it wasn't a good upgrade even initially, but it wasn't anywhere near the 40-50Mbps that LTE averages today in the US. If 5G can follow a similar trajectory, we'll be able to support a lot of things we can't today.


There are plenty of cheap Android phones with 5G now. E.g. there are a few nice Nokia ones. I wouldn't buy a phone without support for that right now. This stuff was standardized and agreed years ago and it's just the rollout that is going very slowly. The same happened with 4G. That took many years.

The point of 5G is better quality of service. Marketing people of course dumb down the improvements to "it's faster" but throughput is actually the least important thing. Though of course a better QoS means that operators can guarantee a bit better throughput as well.

Better latency, improved scaling for operators (more connected devices in an area), lower power usage, and more reliable connections. That is a big issue for me with 4G. Coverage here in Germany is not great, and even in areas that are covered the quality of service is pretty bad; I often struggle to get it to work at all even in places that are supposedly well covered. E.g. the area around the TV tower in Berlin (i.e. the largest FFing antenna in the wider area) is pretty bad because the area is usually full of tourists and the local base station is hopelessly oversubscribed.

That's the kind of stuff you could reasonably expect to improve with 5G.


> Lower latency and greater bandwidth/capacity means a better experience.

True but isn't wifi already lower latency and greater bandwidth? What things can we do on wifi that we can't on mobile data?


Wifi isn't really set up for authentication through the carrier. It also doesn't seamlessly transition between short-range high-bandwidth and more traditional longer-range lower-bandwidth service.


"It's not just about what exists today, but what we want to be able to do in the future."

So a solution in search of a problem, then?


Not really. Just a prediction about the future, and as the infrastructure takes years to build out, it's better to start early than late.

Bandwidth usage globally has been growing at around 30% per year for a long time now, with nothing suggesting it will slow down (obviously not all of this is wireless). LTE will sooner or later run out of bandwidth, hence 5G, which basically allows more Mbps on the same radio spectrum (while also rolling in new frequencies).

Even today, in a relatively low-population-density country like Finland (where I live), LTE connections in cities are slower during peak hours due to the radio frequencies just being full. (In general most plans have unlimited data, so people use them a lot.)


It sounds like you just made the case for 4k TV, where LTE is analogous to 1080p.

The long overdue update came and went. There just isn't the pressure built up for the next one.

I get the idea of incremental capacity and quality updates. But it's drudge work. What business models are people contemplating that have them so horny for 5G?


Yeah, I feel like that's an important aspect. We're never able to completely imagine at the time what's possible. It's only in hindsight that the new thing seems obvious.


> Things like TikTok wouldn't exist without the abundant bandwidth offered by LTE.

And this is valuable? If you dig a big hole in a public space people will fill it with garbage before long.

Technology should be driven by need. Building bigger pipes, bigger roads and bigger holes is just asking for more garbage to fill it with. Let's start with problems that are worth solving.


Entertainment is many people's primary use for their phone. You might not agree that it's worthwhile, but then that's often touted as a strength of the internet -- new and creative applications can spring up based on anybody's wacky idea, and either flourish or die. There's no need to justify their importance to anyone.


LTE was not invented to create TikTok; it was created for other reasons, and it just happens that TikTok could fill LTE with garbage. All tech that depends on other tech does this.


Used a 4G modem for a year because I couldn't get broadband at that address: speeds were patchy, ping was >100ms, "buffer bloat" was a nightmare, I was CGNAT'ed, and you couldn't rely on keeping a TCP connection open for ages without having to reconnect (the buffer bloat and TCP issues _may_ have been my modem though).

Was it ok? Sure, you can stream video just fine (my mobile ISP didn't force 480p video). But for anything beyond basic browsing and consuming content it was a pain, for example playing multiplayer games.

I was very happy when I moved back to a fixed line connection. So yeah, I would have been very happy with more bandwidth, lower ping and a more stable connection.


I'm not sure that was what the OP was talking about - I think the question was "If you already have good 4G, would you benefit?" - aka, isn't good 4G good enough?

Just out of curiosity I tested my 4G speeds/ping - I got 30Mbps and an average of 35ms (6.4ms jitter) here in the Netherlands (KPN network). Given my test results I certainly think you could have had a better network.


The quality of 4G is largely based on how crowded the frequencies are.

So sure, "excellent 4G" will do a good job, but you still need to expand into the C bands to make that happen.

If OP was talking about 4G vs. 5G with everything else equal, including frequencies, then it's not necessary, but 5G is somewhat better with basically no downsides.


Speeds are highly variable depending on location and network congestion. I think "already have good 4G" is a bit of a weird thing to say given that 5G can mean increasing the chance that you have a solid connection.

But even with a good 4G connection, I think 10ms is better than 35ms and I think 300Mbps is better than 35Mbps. I think our current 4G networks work well, but higher speed and lower latency will improve the experience even for our current use cases and even when a person has a good 4G connection already.


But that's no better with 5G. In fact the majority of 5G deployments right now are on higher-frequency bands than 4G. So if anything the range of 5G is less than that of 4G, meaning: unless they happen to have built a new tower near you, your 5G reception will be < your 4G reception.


There are a lot of things in the picture, but 5G is offering big improvements in many situations and it's going to get better.

Yes, lower frequency networks offer better coverage. It's why we see sub-1GHz 5G deployments from the 3 US carriers. These networks are offering marginal improvements (say 30% at present, but probably more in the future). As 5G standalone (rather than anchored to LTE) becomes common, they'll also offer significantly lower latency.

A lot of 5G deployments are likely to be from 2-4GHz, which is higher than most 4G deployments. However, in a lot of cities carriers are already splitting cells to get more capacity, so the higher frequency doesn't matter as much as you might worry. Carriers are also working on 5G carrier aggregation, which will extend 2-4GHz coverage a lot farther by allowing lower-frequency signals to handle the uplink. Towers can put out more power to push their signals farther, so the limitation is usually the uplink. If that's handled by lower-frequency signals, it will effectively extend the range of 2-4GHz networks.

Yes, 5G is better. It's definitely not fully realized right now with a lot of carriers putting out very basic 5G networks partly for marketing purposes. However, 5G offers better spectral efficiency and it does offer opportunities to extend the range of higher frequency signals via carrier aggregation.


When I tested 4G for home internet, it had 24ms-30ms+ latency - it was noticeable for RDP sessions.

When I tested 5G, I got 1Xms (forgot exactly what, but let's say half as much - maybe less). Anyway, latency was noticeably better (at least for RDP sessions). So that is the single best improvement of 5G. Download speed was 320Mbit wired, 250Mbit over wifi, but only 20Mbit upload - not so great.

Of course it all depends on the ISP, configuration, etc. When I first got the modem, upload showed 10Mbit; I complained to the ISP, then it was 20Mbit, but depending on the time of day 16-20Mbit.

Anyway, I don't understand the reasoning of "why do we need...?". With that reasoning, why bother going past 100Mbit ethernet? Oh, nice, we got 1Gbit, so that's enough. I mean we may not saturate the channel, but it is nice for the ISP to have the capability, plus some nice capabilities when we need that bandwidth (some huge transfer or whatever). Plus streaming is data hungry, and you may have multiple streamers at home.


Just tested mine (I still use that provider for my phone): 100Mb down, 30Mb up, 21ms ping unloaded (I’m about 20 miles from the DC where the server was located). However, once the connection was loaded down, the ping shot up to 497ms.


I don't think that justifies the 90% reduction in overall range from 4G, but ymmv.


Are you looking at millimeter wave ranges there?

The range of 4G depends a lot on which band you pick, but these C-band frequencies should go several miles, maybe half as far on average.


Same could be said about 2.4GHz and 5GHz WiFi.


Running on the same frequencies as current 4G deployments, 5G should have very similar range. The massive reduction of range is only on additional frequencies available.


5G additionally uses the same frequency bands as 4G, with the same range, just better throughput/latency.


There are many categories of LTE UE beyond just phones. Applications like professional live streaming cameras or home & business internet (free from the tyranny of cable providers) could benefit from additional bandwidth. There are supposedly also additional sleep modes for 5G but I'm only familiar with DRX modes on 4G LTE. This would theoretically be important for small battery powered sensors, but again, short-range frequencies probably restrict this to devices that don't need to rely on batteries to begin with. Maybe smart watches?

Does it deserve all the hype? It doesn't really seem like it. But cell infrastructure is definitely used well beyond phones, so I'm trying to keep an open mind.


Other than general ideas of “the future”, having faster speeds allows current applications to complete their data transfers faster, which allows them to get off the radio faster, which means more space is available for other devices. So even at current workloads, you can handle more users in a single area.
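
A toy illustration of that airtime argument (Python, with made-up numbers purely to show the shape of it):

    # If each user transfers the same payload, a cell serves users at a rate
    # proportional to per-user throughput: faster transfers free the radio
    # sooner. All numbers below are hypothetical.
    payload_mb = 25  # megabytes, e.g. a short video

    for label, mbps in (("slower link", 40), ("faster link", 300)):
        airtime_s = payload_mb * 8 / mbps  # seconds spent on the radio
        per_minute = 60 / airtime_s        # back-to-back transfers served
        print(f"{label}: {airtime_s:.1f} s of airtime, "
              f"~{per_minute:.0f} transfers/minute on one channel")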


This is what people said when LTE came in too. 3G with HSDPA was fine. Now we sigh when we drop back to 3G. 5G is not just for today but for tomorrow :)

Also, 4G made decent YouTube streaming possible. It worked on 3G but was horrible. Now people can't do without it. We won't even know what 5G will bring us until it does.


Honestly don't recall hearing that about 4G... to me it was this:

2G: Slow WAP internet browsing
3G: Better internet browsing, maps, basic audio
4G: Stream video and better audio
5G: ??

All the above were known beforehand because they allowed current apps which were on broadband to be used out of the house. I would be more convinced there would be everyday applications if there were current technology on broadband that I could take remote.


> current technology that is on broadband that I can take remotely.

I actually think that there are a couple of solid examples of this:

* Console/PC videogames: people wanna play God of War on their bus ride, as evidenced by the massive success of the Switch and the growing excitement for the Steam Deck

* High-quality video calling: latency and bandwidth increases in 5G are both huge for the experience of FaceTime/Skype/Discord/etc.

* VR/AR: again, applications that are seriously harmed by both latency and bandwidth, and ones that have many concrete, business-value use cases in a mobile / non-living-room setting.

and my "dark horse" candidate:

* Machine learning applications: any "AI"-powered product or service has a quality that is pretty directly proportional to the amount of iterations it can perform. Self-driving cars have the capacity to gather gigabytes per second of environmental data; the ability to stream this data back to a central (e.g. Tesla) datacenter, rather than just doing processing onboard, confers a pretty serious increase in model quality.


I saw this mention of self-driving cars a couple of times in these comments. I'm pretty sure you don't want your self driving technology to rely on 5G connection, otherwise a sudden unexpected drop in connection quality (an antenna down for maintenance, an unmapped tunnel, an exceptionally crowded area due to a demonstration with lots of people streaming video, etc.) would also reduce the quality of transmission and maybe make the connection just impossible. What would happen in that case?

I think a "local" network-free source of computing is unavoidable, and once it is available, I just don't see the point of setting up a complicated (and error-prone) infrastructure to make computations remotely when it is available and fall back on-the-fly when not. Given the amount of regulation this kind of things have to comply to, and how error-proof they must be, I wouldn't think anybody would be happy with this cascade approach...


I don't think the parent is saying that this remote computation is to be used for the actual driving, but that data can be sent home to have more usage patterns, to have data about edge scenarios and stuff like that.


My car might want to connect to traffic signals to get an idea of when they'll be turning green, or communicate with the car next to me to let them know I'm changing lanes in front of them.


Something like Stadia or Steam's remote play maybe?


I don’t remember ever saying that about 3G haha.


I knew what 4G would bring me before it did. With 5G I really don't know, I'm not interested at all.


No but I have said “I’d like LTE to not be horribly congested and unusable” when in high-demand.


I’m just happy to be able to switch from my $130/month cable provider with terrible service to a $50/month 5G provider, while it lasts. I’d honestly rather have a hardwire connection, but I am incredibly happy that competition has finally arrived.


5G is just an excuse to not run fiber the last 100 feet to your residence.

Basically the cell companies outbid the homeowners for control of the last 100 feet between the house and the nearest utility pole. Now the auction losers will proceed to get screwed.


Pre-pandemic, LTE was just too congested in busy city centres and other high demand locations in the UK.

5G will help alleviate the strain on LTE networks in the long run, so I want (and have) it for that purpose.


I doubt most of this is related to LTE. Other megacities seem to handle it relatively fine including in countries with similar mobile internet adoption. Meanwhile, in London some mobile operators at least in my travels seem to have nearly useless (slower than 2G) internet connectivity across large swaths of the central area.

I’m sure 5G will help, but unless I’m missing something I would suppose LTE could provide far better access in UK cities today.


Every home internet connection will be by 'mobile' internet, that's my prediction.

When a friend moved to London for six months, a 4G modem & router was the cheapest option, (not least because rolling 1month SIM plans are common here, but ISPs like twelve months, and charge exit fees or a premium for a shorter plan), faster than the cheapest ISP plans, and delivered next day vs. two weeks' wait for an 'engineer' to visit and mess about with 'master sockets' and 'lines'.


> Every home internet connection will be by 'mobile' internet, that's my prediction.

I highly doubt it. There's only so much capacity per base station, so the amount of base stations needed would be prohibitive in urban settings.

I honestly don't think this is a serious plan even by cell companies, except in rural settings. That isn't to say that they won't dupe some number of urban dwellers into it, too. Enough to earn some money, not enough to overload capacity.


> I honestly don't think this is a serious plan even by cell companies, except in rural settings.

I can't speak for their seriousness obviously, but EE (major provider in the UK) is advertising wait-list sign-up/'coming soon' in London.

I recall tube ads years ago (it was probably 4G, but certainly not 5) for a minor player that sprung up using someone else's network (Three/EE/O2/Vodafone, I don't know) specifically to provide home internet. I think they got killed by uncapped/high cap data becoming more common, tethering being more allowed; so anyone who cared could use any provider (and any modem) directly - and most people don't care they just want it to work at a good price and be easy.

I think EE will make it an opt-in up-sell for now, but that eventually they (and others) will just send SIMs & modems out instead of 'engineers'. No more copper lines to faff about with. If the customer wants a landline number, well that's just adding minutes on to their SIM plan. (And for a while they'll get away with charging through the nose for that, like the old £15pcm line rental or whatever, until the masses realise that any SIM would work, and most high/unlimited data ones have lots of if not unlimited minutes included anyway.)


Of course they'll try to lure in as many as they can. What I'm saying is that their plan surely can't be to have most people on this kind of service in urban settings. It just won't work!


It probably won't be every home, but certainly some percentage of customers could be covered by fixed wireless easily enough. It then becomes yet another point of competition for wired network providers.

This was previously the dream with 4G/WiMAX, but ultimately they weren't able to have enough bandwidth per tower to make it reasonable. 5G has much better efficiency enabling way more clients per tower and far more throughput.

And it's definitely a serious plan by cell companies. Several companies are already selling this. Obviously, they're not exactly planning for every home, but enough to actually market it as a real service.

https://www.t-mobile.com/isp https://www.verizon.com/5g/home/


I'm struggling to see how this pays off for them. The amount of base stations they'd have to put up and maintain seems ludicrous. Surely it'll be cheaper, in a densely populated area, to build out fiber (which the base stations need anyway)!

But yeah, I don't doubt that they'll try to lure a good amount of people in. Enough to make some free money, not enough to have to actually invest more in the network. Kinda like how ISPs (at least in Europe) were heavily pushing ISDN onto people as we stood before the dawn of DSL.

I doubt it'll be a large proportion of urban users though. I must admit I lived for a year on this kind of service, but this was because the town I lived in had some strange local government monopoly on ISPs, and the monopolist delivered service that was barely usable (not hyperbole). The mobile providers circumvented this by "not being ISPs", and while what they offered was as terrible as everyone else in this thread has related, it beat out the even more terrible monopolist cable offer. I switched to fiber as soon as the monopoly was lifted.


There's a few things where it starts to make more sense.

1) In the US, it's stupid expensive and difficult to be yet another provider laying cables to people's homes. When it comes to coax and fiber, there aren't local exchange carriers in residential areas; the company who laid the cable is the only one able to offer services on that cable. For another ISP to come in to compete at all, there's a massive amount of red tape and cost to lay essentially redundant cabling. Launching a new wireline ISP is stupid expensive and there's a lack of viable competition in many markets.

2) Cell companies have spent years splitting cells and making smaller and smaller cells in suburban areas. They've been building small towers all over the place and just about every water tower and office building these days has tons of antennas on it. So there's plenty of places to stick new equipment in suburban areas.

3) 5G is both a more efficient protocol and also a lot of new spectrum. So they can continue to use the low-band frequencies they're using for 3G/4G service for mobile users while offering new services on the newer bands. Since they already have lots of places to stick antennas with fiber already laid, it's overall pretty cheap to more than double the capacity per tower. In reality, it's far more than double - probably several times as much bandwidth per tower, as the newer frequencies can help split the same tower into smaller sectors.

4) A lot of those new bands are pretty much trash for mobile users in suburban settings. Low on the ground, in a car, deep inside a home, you're not going to get much service. However, if you put a small antenna near a window facing the tower, you'll get excellent service.

So to recap, there's a good bit of room for more competition in home internet services. There's already lots of places which are already set up to hang antennas with plenty of backhaul connectivity, so adding a ton of capacity per existing cell isn't terribly expensive for the carriers. 5G allows for more bands with far more bandwidth available overall. A lot of the new bands are trash for mobile users in many spaces, but are very practical for fixed wireless. This means fixed wireless 5G could become very competitive with wireline services in suburban and rural residential internet markets.


>"(not least because rolling 1month SIM plans are common here,..." Does that just mean prepaid SIM cards? If so how does the rolling part work? Does that mean you can roll-over unused bandwidth to the next month?


Not prepaid - just like a 12-month contract but only 1 month long, so you can cancel (or change the amount of data or whatever) in any month and, without penalty, not pay the next.


I think it's also supposed to help with density. Ever been to a busy event (sports, concert, etc) and had no internet even though signal strength is fine? A lot of times companies will move in temporary/portable towers for big events but having more baseline capacity is good, too

I was under the impression 5G is pretty worthless outside densely populated areas

Other comments mention fixed wireless which is also big (home internet access)


With 5G I can pull 500 down on my phone sitting inside a building. If LTE hadn't become so saturated with users it would have been fine. I remember back in the day T-Mobile had faster 3G than 4G due to congestion.


I'm going to guess that you have home broadband internet that is reasonably fast and reliable.


But that’s just it. How is 5G-based rural broadband a significant improvement on LTE-based service?


More customers being able to be served from the same number of towers with better per-customer performance.

One challenge in rural broadband deployments is there are only so many good locations for the carrier to actually stick antennas. Maybe a water tower or two, you might get lucky and have some tall radio tower at the edge of town. So then all the customers in the area are essentially sharing a single tower, all needing to point their CPE to that one point. You can only slice the tower up into some number of sectors and once you hit that point the tower is saturated. Every additional customer is now diminishing the service for every other customer.

Having better spectral efficiency per sector means you can service more customers, have better service per customer, or even both depending on how you balance it.


I personally don't need more bandwidth when I'm on the go, but would love to see some real alternative to my local cable company.


I'd love to have a lot of bandwidth on my phone but the pricing in my country is tied to data usage and ridiculously high. 5G is essentially useless here except for a dozen customers or so who are willing to spend absurd monthly fees for an unlimited data plan (which probably isn't even unlimited if you look at the fine print).


Having a mobile hotspot has allowed me to work just about anywhere I want to. The more bandwidth the better!


Yes, but only in the context of tethering a more powerful device personally.


Each provider requires a separate chunk of spectrum under the current system. So a competitive market tends to multiply the spectrum requirements.


How about not having to pay for internet access for both your home and phone? I would be ok with an unmetered 5G connection for both home and mobile.


Then the network would need to have enough towers to handle everyone's streaming traffic and the cost of that would have to be passed on and eat your savings. Wired networks have lower cost when dealing with high data rates because they don't need scarce shared radio spectrum for higher speeds.

You can already pretty much get what you want by going the other way. Get a fast home internet connection, use WiFi at home and at work and get the cheapest available pay per use data plan for the small percentage of data you use while in a car etc. Doesn't work for people who use a large amount of data while traveling, but that's not actually most people.


Or get a 5G hotspot and use that... and even take it with you for mobility


It's not mainly about more bandwidth for individual users, it's about having more bandwidth collectively.


It's about density more than individual speed. More spectrum, used more efficiently.


No, but lots of people have had terrible LTE service in congested areas and thought "why can't the network provider support more users!?". 5G is much more about more efficient use of scarce bandwidth than it is about raw point-to-point bandwidth.


Many of the frequencies they’ve allotted are just not feasible without the access point being in the same room as you. Long term I see it as less of a replacement for 4G lte and more of a replacement for WiFi.

The idea is that you’re going to pay one cell phone bill for all of your devices if you want to get them online. I don’t know anyone stupid enough to pay for a SIM card for their personal iPad currently, because you’d prefer to tether off your phone instead of paying $10 a month. But in the future you might see people drop their home internet and just go with SIM cards in everything.


> But in the future you might see people drop their home internet and just go with SIM cards in everything.

I mean, maybe, but more likely to just be a 5G modem with wifi/ethernet, like what has been available with LTE. I've got one with a $5/month plan as backup for my DSL, because my DSL is unreliable, the local cable company doesn't service my house, and municipal fiber installation is unsubsidized and the actual installation costs are too high, IMHO.



Thank you. In all these discussions what's curiously overlooked is exactly what the testing revealed.


I don't know anything about the specifics of how things got to where they are, just an outsider watching this regulatory mishap unfold. That being said ...

It's quite interesting to me that the FAA may end up grounding planes rather than carriers delaying a rollout of 5G. This seems to say a lot about society and government priorities in the US.

(personally I don't know enough to make an informed value judgement here, but as a technology worker I'm biased toward preferring more and better internet at the cost of harming other sectors)


Ultimately you have two separate agencies that can't block each other except by appealing to higher authority.

With FAA not being able to block the rollout, the only way of fulfilling their mandate is to ground planes.


For perspective, elsewhere in the world it's already been rolled out - in some cases with less buffer than is proposed in the US. The carriers have already delayed their rollout multiple times and the FAA still hasn't been able to prove there is an impact, rather they keep issuing defensible statements that it could have an impact despite having had years to work on this.


As a small time hardware maker who spends an inordinate amount of my limited resources trying to meet FCC rules, I find this infuriating.

Whenever I complain about how outdated and ineffective the FCC is, the literal pushback I get is "but what about airplanes and satellites". To which I say, touché; we should all do our part to make sure we have safe airports and open channels for satellites and astronomy.

Reading through a link posted by @supernova87a (https://www.rtca.org/wp-content/uploads/2020/12/Slides-5G-In...) I have two take-aways:

1. The spectrum involved is valued in excess of $4.6bb

2. Most 5G deployments will create in excess of 100x (20 dB) the allowable interference for radar altimeters
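
(For anyone rusty on the units, the 100x/20dB equivalence is just the decibel power-ratio formula:)

    # A decibel figure expresses a power ratio on a log scale:
    # ratio = 10 ** (dB / 10)
    for db in (10, 20, 30):
        print(f"{db} dB -> {10 ** (db / 10):.0f}x the reference power")
    # 20 dB is a factor of 100 -- the margin cited above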

From the report, page 18 (SC-239 study results):

HUGE AMOUNT OF SCENARIOS INVOLVED

Every base station configuration produces harmful interference both from 5G fundamental emissions in the 3.7–3.98 GHz band and 5G spurious emissions* in the 4.2–4.4 GHz band, across virtually all operational scenarios and relative geometries between the aircraft and base station.

5G user equipment (mobile phones) that may be operating onboard aircraft were also found to exceed the safe interference limits for Usage Categories 2 and 3.

I find it increasingly difficult not to be cynical about the role of the FCC with respect to public safety issues with a lapse like this.

If I shipped a product into the US that had a radar altimeter interference profile like this, I'd be fined and sued into the ground a hundred times over, and I'd accept that fine and outrage because I personally couldn't live with being responsible for any planes literally falling out of the sky because they thought the ground was in the wrong place on account of my product's interference.

But apparently, with enough lobbyists and money, it's an acceptable risk. It's hard for me to take the rest of the FCC's rules seriously, when they act as just another rent-collecting agency at the beck and call of lobbyists whose sole mandate is to create market barriers against small startups and imported goods, with little regard to actual safety, science, or modern technology...

edit - formatting


"Don't interfere with defective radar altimeters" is not the way the law works. The law says don't emit more power than you are allowed to within your assigned frequency band, and much less outside it.

The basic problem here is that way too many radar altimeters are fundamentally defective in terms of being sensitive to emissions they should have been designed from the beginning to ignore.

The FCC doesn't make any promises that listening to the wrong frequencies won't produce harmful interference. That is the receiver's job to avoid listening to signals on those frequencies.

And if somehow radar altimeters can't or couldn't be designed to ignore radio emissions outside their assigned frequency range, the FAA should have demanded a far larger frequency band from the beginning instead of implying the one that was assigned was adequate for the purpose and then later attempting to squat all over frequency bands they were never assigned.


The filter mask is reasonable for broadband radar, especially considering that we have to consider equipment older than 5G (I am plagued by FCC rules that protect AM radios, so why should radar altimeters be considered defective and not AM radios?). If you have to make a guess as to which is wrong, the legislation, or the laws of physics, I'd put my bets on the legislation being wrong.

Furthermore, things could be much worse because the filters that are used to prevent leakage into radar altimeter frequencies can fail on a 5G device, but the 5G device would continue to operate just fine without anyone noticing (except for the plane that ended up in pieces on the runway).

In other words, 5G transmitters would normally emit sidebands that jam altimeters, if not for a handful of components that prevent it from happening. But if one of these components were to fail -- say, a user dropped their phone and one of the filter elements cracked or de-tuned -- it'd just straight up jam an airplane's altimeter.

I'd like to think that regulatory bodies exist because they are supposed to consider public safety as part of their mandate. We don't want to lose our children in an airplane crash and have the best answer be "whelp, that radar designed 20 years ago should have anticipated 5G jamming, therefore the altimeter was defective; no actual laws were broken, so those lost lives are just the price of progress".

I think an alternative view is that the FCC should hold spectrum auctions not just because "valuable things should go to the highest bidder", but in part to raise funds to upgrade things like airplane radars so they can tolerate the interference caused by the new equipment going into an adjacent band.

Government does this all the time: you want to move your business into the neighborhood? You have to upgrade the infrastructure to support the burdens of your new business. We didn't run megawatt-scale power lines to every suburban block just in case someone wants to set up a datacenter there, and if the power goes out because of the datacenter's loads we don't blame the residents of the town for not having anticipated the Internet.

And with over $80 billion raised in spectrum auctions, you'd think that'd be within the realm of possibility - one could replace 100,000 radar altimeters at $10k a pop for "just" a billion dollars, or about 1% of the auction proceeds.
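
Spelling out that arithmetic (the unit count and per-unit cost are the assumptions from above, not quoted prices):

    # Back-of-the-envelope: fleet retrofit cost vs. auction proceeds
    auction_proceeds = 81e9   # ~$81B raised in FCC Auction 107
    units = 100_000           # assumed number of radar altimeters in service
    unit_cost = 10_000        # assumed replacement cost per unit, $

    retrofit = units * unit_cost
    print(f"${retrofit / 1e9:.1f}B, or "
          f"{100 * retrofit / auction_proceeds:.1f}% of auction proceeds")
    # -> $1.0B, or 1.2% of auction proceeds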


The root problem here is that this used to be a satellite downlink band. So virtually no guard bands were required. It is like if someone bought a house in a residential area, knocked it down, and built a steel mill.


Verizon, AT&T, and T-Mobile spent nearly $100B for their C-Band spectrum (including relocation costs). I think that creates a lot of pressure for them to be allowed to put them to use. I'm not saying it should, just that it does.

I think the US wants to be a 5G leader and sees delays as potentially sacrificing that leadership. This might not be a real concern, but people complain any time the US isn't #1 in something and blame everything from politicians to capitalism. At the same time, it seems reasonable to believe that companies with access to new 5G capabilities (speed, capacity, latency) will be able to develop new services and gain marketshare. If American companies are going to develop the next generation of NewThings, playing with the capabilities and limits of 5G networks in the real world will be an important part of that.

We've seen different countries with different scenarios. European countries that have nearly twice the guard band; countries that are mandating down-tilts; countries that have strict power limits; countries that don't have as strict power limits and are comfortable with the risk.

Personally, I don't understand why we didn't go with a phased approach. Set low power levels and a downward-tilt mandate. Ease the restrictions month by month and measure the real-world interference. Sure, maybe it means that C-Band networks aren't quite as good as they could be for 6 months, but I think predictable goals and timelines based on real-world data would satisfy cell carriers (who at least state they're confident that there will be no issue).

> 5G user equipment (mobile phones) that may be operating onboard aircraft were also found to exceed the safe interference limits for Usage Categories 2 and 3.

In some ways, this is what I'm more worried about. People have a smug sense that their phone can't interfere with planes and that regulations around cell phone use on planes are just BS. While the FAA/FCC will be able to enforce restrictions on cell carriers, it's going to be impossible to enforce them on users in flight. You can't effectively make sure that devices are in airplane mode; flight attendants can't see the indicator on my device (or lack thereof) while quickly walking up and down the aisles.

If C-Band causes interference with radio altimeters, I can see the FAA and FCC adding exclusion areas around airports, mandating antenna down-tilts and lower power levels in certain areas, and other mitigations. I think it's reasonably easy to get AT&T and Verizon to implement those and reasonably easy to check compliance. They might fight it, but it's still reasonably easy.

I think if C-Band phones cause interference, it's going to be a problem since we simply won't be able to police everyone on planes.


One thing mitigating interference from phones inside the plane is that at these frequencies the metal shell should keep the signal from traveling far, and once you're airborne the phone quickly stops getting usable signal (and thus stops transmitting, absent beacons from a base station, to save power).

The problem is that some popular new designs have composite shells...
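
To put rough numbers on the attenuation side of this (a sketch with arbitrary example distances; fuselage shielding would add on top of these figures):

    # Free-space path loss: FSPL(dB) = 20*log10(d) + 20*log10(f) - 147.55
    # with d in meters and f in Hz
    import math

    def fspl_db(distance_m, freq_hz):
        return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

    for d in (10, 300):  # e.g. phone-to-belly-antenna vs. phone-to-ground
        print(f"{d:4d} m at 3.9 GHz: {fspl_db(d, 3.9e9):.0f} dB")
    # ->  10 m: 64 dB, 300 m: 94 dB of path loss before any shielding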


To be instrument rated (as all airline aircraft are), composite aircraft must have a metal weave in the outer layer to dissipate static electricity build-up from flying through clouds. This preserves the Faraday-like signal reduction as a side effect. My phone's GPS seems to struggle just as much in a 787 as in any other airliner.


Honestly, I still don't quite get it. The FCC allocated 3.7 to 3.98 GHz for cellular wireless. Then there is 20 MHz of guard band up to 4.0 GHz. Then there is satellite from 4.0 to 4.2 GHz. And then, finally, the altimeter frequency band from 4.2 to 4.4 GHz.

So the two uses appear quite far apart in the frequency domain. Do signals get sent out on the wrong frequency or something? What does it mean to interfere? And, assuming something bad is happening, why?
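
To make that layout concrete (band edges exactly as listed above):

    # US C-band neighborhood, in GHz, per the allocations above
    bands = [
        ("5G C-Band",       3.70, 3.98),
        ("guard band",      3.98, 4.00),
        ("satellite",       4.00, 4.20),
        ("radar altimeter", 4.20, 4.40),
    ]
    for name, lo, hi in bands:
        print(f"{name:16s} {lo:.2f}-{hi:.2f} GHz ({(hi - lo) * 1000:.0f} MHz)")

    print(f"5G top edge to altimeter band: {(4.20 - 3.98) * 1000:.0f} MHz")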


Radar altimeters are inherently wideband devices, designed without any historical need for front-end rejection and the specifications for them reflect that - the RF "testing mask" that's used isn't a sharp rolloff at the edge of the RA allocation. Typical implementations are, for example, only 20 dB down for frequency-dependent rejection by 3.5 GHz.

With potentially high gains from beamforming, both the fundamental mode and spurious emissions from 5G radios exceed the safe interference limits for the current RA specifications by at least 10 dB.
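
A toy illustration of how a shallow mask produces that result. Everything here is invented except the "20 dB down at 3.5 GHz" point from the comment above; the received level and tolerance are placeholder assumptions:

    # Interpolate a made-up front-end rejection curve and check the margin
    import numpy as np

    mask_f_ghz = [3.5, 3.9, 4.1, 4.2]     # assumed mask breakpoints, GHz
    mask_rej_db = [20.0, 10.0, 3.0, 0.0]  # rejection relative to in-band

    def rejection_db(f_ghz):
        return np.interp(f_ghz, mask_f_ghz, mask_rej_db)

    received_dbm = -30   # assumed 5G power at the RA antenna, at 3.9 GHz
    tolerance_dbm = -50  # assumed interference limit at the receiver
    after_filter = received_dbm - rejection_db(3.9)
    print(f"exceeds limit by {after_filter - tolerance_dbm:+.0f} dB")  # +10 dB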


It does seem like an overreaction to me, but there is a big difference between cell towers and satellite uplinks/downlinks.

Radio towers (sites) have widely varying quality control of their tenants' installations. Transmitters can mix and amplify signals from other carriers when there isn't well-designed and well-maintained RF filtering on each and every antenna. The more transmitters at a site, the more carrier frequencies that can contribute to the mixing, and the more opportunities for poor or broken RF filters. At some busy sites there is almost nothing you can do to prevent this. Anything on the tower that acts like a diode can cause mixing, even without being amplified by another transmitter.

This usually just harms other receivers at the site, however. But an airplane flying close by could theoretically be affected, depending on how close it is.


> It does seem like an overreaction to me, but there is a big difference between cell towers and satellite uplinks/downlinks.

N.B. as far as I know, the portion of the C-band that's in question was allocated for satellite downlinks; uplinks were over 6GHz. With that in mind, there is indeed a Big Difference between a satellite in orbit radiating downwards, and a 5G beam forming installation on the ground potentially radiating upwards.


Yeah, and more importantly, was the equipment designed to live with that much guard band? And isn't the cell equipment specified to throw off only a certain amount of harmonics, or whatever the out-of-band noise would be? This seems like either a massive engineering mistake or a tempest in a teapot.


Some background here, if it's useful. This has been ongoing for quite some time, and has a lot of nuance to it, IMO. The following is my summary as a layperson. I've tried to quote sources wherever possible.

- The FCC, which is the government agency that has ultimate control over permitting what goes on in the nation's airwaves, proposed to make additional frequencies available for mobile phone use in 2018: https://www.fcc.gov/document/37-42-ghz-public-notice-opening...

- This was going to be done by "repacking" satellite communications in the C-band (3.7 GHz to 4.2 GHz) freeing up spectrum to be used by cell carriers.

- Boeing encouraged this in late 2018, mentioning that the FCC should give a 100MHz "guard band" to prevent interference with radar altimeters, which operate in the 4.2-4.4 GHz range: https://ecfsapi.fcc.gov/file/121184623679/Boeing%20C-band%20...

- The FCC implemented a 220MHz "guard band" based on studies with radar altimeters, leaving more than double the guard band Boeing requested. This was based on studies evaluating interference at these frequencies (https://arstechnica.com/tech-policy/2021/11/faa-forced-delay... is a good summary; there is a lot to read here).

- The FCC auctioned these 280MHz of airwaves (3.7 - 3.98 GHz) last year in FCC Auction 107, garnering $81,168,677,645 in gross winning bids. As there is a delay before the incumbent spectrum users clear out, AT&T and Verizon were only granted the use of a combined 120MHz of spectrum in limited areas, spanning the 3.7GHz to 3.82GHz range.

- Verizon and AT&T, the largest winners, have sought to deploy the spectrum they paid for in concert with the FCC rules. Specifically, this is the "lower" part of the overall winnings, which leave a minimum of ~400MHz between the top end of the broadcast frequency and the bottom end of altimeter operation.

- Airline carriers and the FAA came in at the 11th hour and asked the carriers and the FCC to halt their deployment, asking for more time to be able to study whether this was safe. After a lot of back and forth, Verizon and AT&T have agreed to some power reductions around airports and runways, and voluntarily delayed the rollout of broadcasting on these frequencies twice - most recently at the beginning of the year.

- The FAA has so far cleared the altimeters in ~45% of the commercial fleet: https://www.cnbc.com/2022/01/17/us-faa-clears-45percent-of-c...

Where we are now is a bit of a battle of regulators, airlines, and the cell operators that spent a significant amount of money on spectrum and capital expenditure (radios, antennas, etc) to deploy that spectrum.

The FCC has cleared the use of the airwaves, but the FAA insists that not all altimeters can avoid being interfered with, and is now issuing requirements that may ban the use of these instruments in situations where they would otherwise have been required. This may make some landings impossible, causing diversions, etc.

The carriers and the FCC insist that over 40 countries are using this spectrum (which appears to be true) with various restrictions. In Japan, for example, there is a significantly smaller guard band of 100MHz, though the transmit power level is much lower than it will be in the USA. The carriers have voluntarily agreed to adopt restrictions around airports that roughly mirror those of France.

At this point, it's unclear what happens next. I am sure I am missing a lot here (why are altimeters sensitive to broadcasts happening 400MHz away from them? are they?), and there seem to be valid concerns and issues on both sides. Ultimately, this largely looks like an embarrassing game of cat and mouse between the FCC, FAA, cell carriers, and airlines.


It seems like Boeing evaluated the altimeters they use and determined that, with the filtering they have, only a 100MHz guard band was needed.

Apparently, though, there exist radio altimeters with basically no filtering at all, which worked fine in practice, since satellite downlinks had nowhere near the power level needed to interfere. But now we are adding more powerful transmissions closer to the altimeter band than before, and altimeters designed without meaningful filtering would be affected.

The claim that 400MHz of guard band (which this initial rollout still provides) might be insufficient for some altimeters is really something. Ideally the FAA can determine which ones they are and deem them no longer airworthy.

I'm also wondering if in fact it is just that the standards for altimeters are far looser than what all altimeters actually implement. The FAA concerns seem to be based on the signals potentially interfering with an altimeter built to just barely meet the testing specifications, but they might all actually greatly outperform those specifications.


Note that the adjacent band interference concern is with radio altimeters, not sensitive pressure altimeters. (The article makes this clear later, but not on the first mention of altimeter.)


While your statement is true, this does not diminish the importance of radar altimeters: the barometric altimeter knows nothing about the terrain at the landing site the way radar does; it only measures height relative to a pressure reference. In low visibility, radar is the only way to tell how close to touchdown you are.


This isn't really true. Before landing, an aircraft gets the airport's pressure setting and uses it as the reference for its altimeters. It's a very common procedure, and these settings are updated regularly.


The radar altimeter is used for collision avoidance on low-visibility approaches and for triggering events in the last few meters on AUTOLAND-equipped planes (rarer in the US but common, for example, at UK airports). The main glide slope is managed with the barometric altimeter, sure, but that one lacks the fine resolution of the radar altimeter and can't detect the ground - it can only track the offset from a specific pressure according to the standard model atmosphere.


While it's true the barometric pressure is included in the ATIS information every plane gets before landing, pilots also rely heavily on GPWS callouts when landing, and those are driven by the radio altimeter, not the barometric one. It's not an issue to land without them, and some approaches and emergency checklists require you to turn them off so that other messages can be announced without GPWS callouts taking priority - but it's ideal to have them, for sure.
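
For anyone wondering how the radio altimeter actually measures height: most are FMCW radars sweeping within the 4.2-4.4 GHz band, and height falls out of the beat frequency between the transmitted and reflected chirps. A minimal sketch of the textbook relationship, with illustrative parameters (not from any specific unit):

    # FMCW radar altimeter: h = c * f_beat * T / (2 * B)
    C = 3e8      # speed of light, m/s
    B = 100e6    # assumed sweep bandwidth, Hz
    T = 1e-3     # assumed sweep period, s

    def height_m(f_beat_hz):
        return C * f_beat_hz * T / (2 * B)

    print(height_m(4_000))  # 6.0 -> a 4 kHz beat tone means ~6 m up
    # Interference that corrupts the beat tone corrupts the height readout.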


I'm baffled as to why 5G was approved and deployed before there were safety verifications with aircraft.


The only real question, imho, is why this wasn't understood when the frequency was auctioned. If we needed more guard band, we needed more guard band. If antennas point to the sky, don't use the close frequencies.

Any answers?


The information provided at the time indicated a 100 MHz buffer would be enough, and the FCC more than doubled that. Now the FAA is saying that even a 200+ MHz buffer beyond the band that was ever allocated for this purpose is not enough for some altimeters.


Here's the list of the 50 airports with 5G buffer zones. At least the Feds will be eating their own dog food: neither Reagan National nor Dulles is on the list (nor is Baltimore-Washington International, for that matter).

Edit: The link is a pdf.

https://www.faa.gov/sites/faa.gov/files/2022-01/50%20Airport...


Update: Reagan National apparently doesn't have a tower nearby, so it doesn't need to be on the list.


The only opinions that matter in the end are the pilots'; I want to hear from them. Are you willing to fly a loaded jumbo into an airport with a flaky altimeter during a zero-visibility landing?

Airline and manufacturing execs and the FAA/FCC are not risking their lives every day (although it might be eye-opening if they were all willing to cram into a plane and land in a thunderstorm at an airport with 5G towers at the end of the runway).


Pilots don't know anything about RF beyond "VHF is line-of-sight" and "use HF over the ocean." I'm inclined to listen to their (our) concerns but we don't have some special understanding of the underlying technical issues.


For most here, this will be familiar, but what this is affecting is the radar altimeter.

You've probably seen the videos where the system will do an audio countdown "100, 50, 40, 30, 20, 10, retard, retard" [Airbus].

That system is critically important in zero-vis triple-autopilot landings. You don't want cell phones screwing that phase up.

https://youtu.be/UV_vWtAJIow?t=142


You don't want to be using a radar altimeter that can't handle a 200 MHz guard band at 20 feet. That is seriously messed up on the plane-manufacturer front, not the cell-phone front.


This is what I can't figure out. 200 MHz is a large guard band, and it isn't hard to design filters for that at all.

Now, single-digit kHz filters at RF frequencies, that's difficult. But not 200 MHz!

I guess the radio altimeter manufacturers just figured, "there's a large guard band, why bother putting in filters", or what?
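
For a feel of the numbers (a toy lumped-element prototype; real altimeter front ends would be cavity or ceramic filters, and the filter order here is my arbitrary choice):

    # Rejection of an 8-pole analog Butterworth band-pass on 4.2-4.4 GHz,
    # evaluated at the edges of the US 5G C-band allocation.
    import numpy as np
    from scipy import signal

    lo, hi = 4.2, 4.4  # passband edges; working in GHz keeps things scaled
    b, a = signal.butter(4, [2 * np.pi * lo, 2 * np.pi * hi],
                         btype="bandpass", analog=True)

    test = np.array([3.98, 3.7])  # top and bottom of the 5G C-band, GHz
    _, h = signal.freqs(b, a, worN=2 * np.pi * test)
    for f, resp in zip(test, h):
        print(f"{f} GHz: {20 * np.log10(abs(resp)):.0f} dB")
    # -> roughly -42 dB at 3.98 GHz and -65 dB at 3.7 GHz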


The odd part is that Boeing originally proposed a 100MHz guard band. What we have is more than twice the headroom a major manufacturer suggested for this purpose.


Even weirder, the critical phase in all of this is generally airport approach and landing, i.e. at pretty low altitudes. If you can't design something to tell you whether you are 100 / 50 / 20 feet off the ground with a 200 MHz guard band ON BOTH SIDES, that is insane (600 MHz of total bandwidth just to tell me if I'm 20 feet off the ground??).

If you are a ham, these bandwidth demands are just nuts.

And this has been 10 years in the making. The FCC must just be rolling their eyes - why didn't the FAA do something earlier?

We have coverage during the landing phase from the localizer and glideslope as well, plus RNAV / WAAS-enhanced GPS options, etc., plus the Mark 1 eyeball. Somehow in the last year the US has just become helpless to solve problems. Hard to believe folks flew across the ocean with no GPS or moving map.


Why is this only a problem in the U.S. if 5G is already active in China and many European countries? Obviously airplanes aren't falling out of the sky in those places. Seems like much ado about nothing. If anything, this appears to be an FAA/FCC-induced problem, where aircraft are being blocked from flying by over-zealous regulation.


I'm not an expert, but it seems like other countries use lower power levels than what is being authorized in the US. https://www.bbc.com/news/business-60042178

Also, and this is just my opinion, but the absence of plane crashes does not mean it's safe! Some faulty things exist for years before anything happens and/or the issue is noticed.


Different frequency allocations in different countries. Nearly no one else uses the frequencies the FCC wants to enable, at the given power levels and rules.



I'm unable to understand whether this is something to genuinely be worried about, or if it's red tape.

It sounds kind of hysterical, sort of like programmers worrying about cosmic rays. Is this something that's deeply improbable and insignificant, or is it something that is guaranteed to happen eventually even though it is unlikely?


> is it something that is guaranteed to happen eventually even though it is unlikely?

No. If you have an instrument operating at 4.3GHz and it suffers interference from radios operating at 3.9GHz, it indicates inadequate filtering. That's a design issue, not an event with a statistical chance of happening no matter what.
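
A back-of-the-envelope link budget makes that concrete. Every number below is an assumption picked for illustration, not a measured value:

    # How much front-end rejection would the altimeter need?
    import math

    eirp_dbm = 75        # assumed base-station EIRP with beamforming
    distance_m = 300     # assumed slant range on short final
    freq_hz = 3.9e9      # 5G carrier frequency
    tolerance_dbm = -60  # assumed out-of-band power the RA can tolerate

    fspl = 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55
    received = eirp_dbm - fspl          # ~ -19 dBm at the RA antenna
    print(f"need {received - tolerance_dbm:.0f} dB of front-end rejection")
    # -> need 41 dB, which a reasonable filter can supply that far out of band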


Seems like:

- downside of delaying 5G: slightly slower download speeds (?)

- downside of interference: planes might crash (!)

Why not just give the FAA all the time it wants?


> Why not just give the FAA all the time it wants?

For the same reason we allow multiple private companies in multiple countries to pollute the skies with (eventually) millions of satellites:

Sacred, holy, internet. Without which all human civilization could crumble to dust and blow away.




"Passengers should check with their airlines if weather is forecast at a destination where 5G interference is possible," the FAA said Sunday.

It only affects landing in bad weather.


Really, they're going to ask airlines to publish this, and passengers to follow up, to find out whether it's going to be an issue on their particular flight?

Passengers should also be sure to look up all the latest airworthiness directives and advisory circulars for the plane and engine type they're going to fly. And get the crew's medical reports, just to be sure.

Hell, maybe passengers might as well check that the aircraft manufacturers haven't modified any critical safety systems the FAA should have been regulating.


US airlines warn C-Band 5G could cause 'catastrophic disruption'

https://www.engadget.com/us-airlines-warning-letter-21191993...


A better title: "US airlines warn government ineptness in slow approval of regulations could cause 'catastrophic disruption'". This is a regulatory failure, not a technological one.


I am highly skeptical that we need additional 5G spectrum. We're not even fully using the spectrum that has already been auctioned.


Carriers are not currently using the spectrum that they gained in Auction 107, because of the issues that this article mentions. Their agreement with the FAA has given Verizon and AT&T the go-ahead to begin operations on this spectrum tomorrow (outside of existing FCC-approved test sites, which have been operating for months).

Which spectrum are you referring to that carriers have purchased and not deployed, if not 3.7–3.98 GHz?


How about making free Wi-Fi mandatory on all planes above a certain passenger count? That minimizes but does not eliminate the issue.


The 5G cells in question aren't for the people on airplanes; they're providing service in the area around the airport.



