USB 3.0* Radio Frequency Interference on 2.4 GHz Devices (2012) (intel.com)
364 points by cyberlab on March 29, 2021 | 226 comments



I'm surprised this isn't discussed more widely (though even vendors know about this [0]). My wireless Logitech mouse didn't work properly when I had a hub connected to bring some connections to the front of my iMac (the mouse had significant lag both over Bluetooth and with its dongle).

It's hard to believe that there is a standard and devices are widely deployed that mess so much with their environment.

[0]: https://support.logi.com/hc/en-us/articles/360023414273-Wire...


> It's hard to believe that there is a standard and devices are widely deployed that mess so much with their environment.

This is entirely a problem of our own making.

We give cellular providers hundreds of MHz of exclusive spectrum access, and then deprecate and auction off the old analog TV channels to give them hundreds more, but we expect everybody else to shove all of their traffic into a 100 MHz block at 2.4 GHz.

It's not that this standard is doing anything particularly nasty to the environment, it's just that it happens to be messing with the one tiny head-of-a-pin that we've decided everything needs to live on.

It's a computer controlled radio for crying out loud, give us 400 MHz and let the computer figure out how to hop around.


> This is entirely a problem of our own making.
> We give cellular providers hundreds of MHz of exclusive spectrum access, and then deprecate and auction off the old analog TV channels to give them hundreds more, but we expect everybody else to shove all of their traffic into a 100 MHz block at 2.4 GHz.

I don't think this is a bandwidth issue. USB is a wired standard. Shielding itself is not that hard, but doing it inexpensively while maintaining cable flexibility and low attenuation at high frequencies is very hard.

I wonder how manufacturers of those devices manage not to get caught by the FCC in the US and the equivalent agencies in the EU. Perhaps they sell directly to customers overseas?


The market rate for 400 MHz is $80 billion. There's only one industry with enough demand and revenue to justify that purchase price, which is mobile.

2.4GHz may be crowded, but it's nowhere near as bad as mobile where the demand for bandwidth has been doubling yearly for two decades.


The FCC's job is not to get every last penny for every last Hz, it's to be a good steward of the commons.

There is utility in the general public being able to use more unlicensed spectrum at home and in the office.

We can argue about what the right breakdown might be, but I'll start by asserting that ~3 GHz in total between Verizon, T-Mobile, and AT&T, vs. only 250 MHz in total for the entirety of local, short-range wireless communication is absurd.

On Wi-Fi, everybody just shouts at each other. On mobile, providers buy huge swaths of spectrum, partially as a monopolistic strategy to make life harder for competitors, and partially because they're still using cell allocation strategies from the '80s. They maintain exclusive rights to blocks that they are not using in a region, because the towers two cells away are using them, and it's just easier to use a fixed checkerboard allocation.

Both Wi-Fi and 3GPP standards can and should be improved to make better use of temporarily unused spectrum.

A good start might be to prevent carriers from having exclusive rights to any spectrum. At least one layer of the cellular protocols should be standardized across carriers allowing towers and phones to dynamically request and then relinquish spectrum on an as-needed basis.

Today, if 500 people in a room use T-Mobile phones, but Verizon owns all the spectrum, then nobody gets to use anything. This is stupid. Verizon should have access to a fraction of spectrum proportional to their users in an area. More users, more spectrum, and vice versa.


>Today, if 500 people in a room use T-Mobile phones, but Verizon owns all the spectrum, then nobody gets to use anything. This is stupid. Verizon should have access to a fraction of spectrum proportional to their users in an area. More users, more spectrum, and vice versa.

That would require a fairly radical departure from the infrastructure of the existing cellular networks, wouldn't it? Right now, each provider has a monopoly on a portion of the spectrum within a defined geographical area, and provides the base station and backhaul infrastructure to support their network.

It's not like "Verizon owns all the spectrum in a room"; it's that Verizon has better base station coverage of that room than T-Mobile does.

The providers compete on, amongst other things, coverage and network performance. Sharing spectrum would effectively require mutualization of base station infrastructure. You would effectively have a single monopoly with the networks operating as virtual operators. It's very far from clear that would be a good outcome, to me, at least.


They could share spectrum with different equipment in the same way multiple WiFi routers can share spectrum. The underlying reality is that cell networks don't actually need a lot of spectrum; instead, they're trading owning a lot of spectrum for fewer cell towers, because they aren't charged the full price of that spectrum. Add some significant property taxes on that spectrum and you can bet they would be selling or handing a lot of it back to the public domain.


I bet they would just pass on the extra property tax costs to the customers and keep their spectrum.


> That would require a fairly radical departure from the infrastructure of the existing cellular networks, wouldn't it?

I think it can be done at the legal layer - just require roaming agreements between providers. They can already roam across borders so there's no technical barrier.


I'd rather the telephone industry spend $1 on hardware than see them spend $1 on license fees.

The story now is "$80 billion in license fees" and "$20 billion in hardware" and that seems the wrong way around -- when most folks play poker the stakes are supposed to go up, not down.


The money is made up! The FCC could sell spectrum for pennies, or give it away, or restrict access to The Right People, or wash their hands of it, or do another auction. They've done all of these things at various points. The license fees are made up.


The way it's done now puts the spectrum where there's the most value. However, the result is that it's effectively a tax which the consumer will indirectly pay.


> where there's the most value

It's where there is the most money, not value.


> The FCC's job is not to get every last penny for every last Hz, it's to be a good steward of the commons.

That’s the spin.

Yes, the FCC is the steward. But it’s also a bureaucratic, underfunded, somewhat backwards agency.

SIGINT is pretty cool, and FCC has some cool things and people that interact with them, I’m sure. But the outward face of FCC and how they appear to operate seems to have hardly changed in past years. I respect that typically, but it’s like an old bowling alley kind of respect; it’s fun, but it’s old and dirty, and the nachos aren’t bad but they aren’t good. The Lysol sprayed in those shoes is for psychological comfort. You don’t need to know the real reason.


FCC should just bill per joule emitted.

All licenseholders should add up their transmitted joules of energy at the end of the year and pay the FCC. It will be in everyone's best interests to make most efficient use of the spectrum because then they can get their information to its destination with fewer joules.

Devices like home wifi routers should be able to buy a license upfront for X number of joules/year, and the cost of that is included in the purchase price of your router/laptop.
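
To put rough numbers on that (my own back-of-the-envelope, with made-up figures for the transmit power, duty cycle, and fee rate): a home access point radiates surprisingly little energy over a year, so a flat per-joule rate would have to be set per band to add up to anything.

    # Back-of-the-envelope for the per-joule idea (hypothetical numbers).
    tx_power_w = 0.1            # assume ~100 mW transmit power for a home AP
    duty_cycle = 0.10           # assume the radio actually transmits 10% of the time
    seconds_per_year = 365 * 24 * 3600

    joules = tx_power_w * duty_cycle * seconds_per_year
    print(f"radiated energy: {joules / 1e6:.2f} MJ/year")       # ~0.32 MJ
    print(f"fee at a made-up $1/MJ: ${joules / 1e6:.2f}/year")  # pennies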


> The market rate for 400 MHz is $80 billion.

If market rate were justified for everything then we wouldn't even have space reserved for HAMs.

> There's only one industry with enough demand and revenue to justify that purchase price, which is mobile.

There's plenty of industries with enough demand. There's just no way they can compete with the money that mobile offers. That's a damn shame given that money shouldn't be the only, nor even the main, driver.


I think OP was using a rather consequentialist and capitalist economic model, where the value of a thing is by definition equal to the money that you can get for it on the open market.


Or maybe just that governments rarely pass up on that kind of revenue unless it's a huge PR win, in this case it wouldn't be very noticeable.


This is a part of the mobile market. They're the same market. 5G providers have frequencies that can't travel through walls and don't even make sense for a long-distance network. Those frequencies should just be part of the Wifi standard.


It's our spectrum to sell or not sell. No industry needs to "buy" public bands, we can just decide to allocate them for the public.


It very much depends on which band that spectrum is in. If 400MHz was worth the same everywhere, it's doubtful that amateur radio would have >4.7GHz of it allocated on a primary basis in the EHF band, for example.


Give it time. Once these bands become more utilized, and more hardware gets pumped into the system, this bandwidth allocation for hams will shrink just like it has in every other band. They'll find any reason to reduce these allocations in favor of some industrial monopoly that may or may not even use what it takes (cf. UPS).


Kinda wild that there's such a steep market rate for being allowed to adequately communicate over the open air.


What about 5GHz wifi? How does it work?


It "works" but it is a mess. Large chunks of the spectrum require some combination of radar detection and power reduction, and if you live near a radar station you basically cannot use 160mhz wifi.

I will also add that 60ghz works great, but it will not penetrate walls. I have found it penetrates my doors and thus makes a great replacement for 2.4ghz/5ghz mesh networking in my apartment, but not much else.


> if you live near a radar station

This is actually incredibly annoying, especially in a residential setting where the 2.4GHz band is incredibly noisy (5GHz doesn't penetrate as well so it ends up being less noisy).


At that frequency, it becomes really hard to penetrate walls or travel without direct line of sight.


This would be more believable if Dish Network wasn’t squatting on a ton of spectrum.


There is a big block of spectrum around 5GHz for unlicensed use, and another huge one in the 6GHz about to be opened up.

The trouble is that protocols like ZigBee and ZWave and Bluetooth and ANT+ are stuck in the 2.4 GHz band and practically you cannot turn off your 2.4 WiFi access points and be happy.

Thus I have to go to the woods to pair my fitness band with the heart rate sensor because at my house who knows what is going on with the hue light that is on the wrong side of my monitor from the hue hub or the smoke detector that posts the battery status to SmartThings, etc.


The big chunk around 5ghz is more restricted e.g. DFS/TPC requirements. Using 160mhz channels is impossible in my NYC area apartment because of those restrictions.


Zigbee is in 2.4GHz; Z-Wave is 900MHz. https://en.wikipedia.org/wiki/Z-Wave#Radio_frequencies


It's not just 2.4 that USB 3 kills. I've had it jam GPS from a couple feet away, and that's down at 1.3 GHz. Did some EMI tests; it was really broadband interference. I really have no idea how these chipsets got approved.


GPS is sub-thermal noise and the receivers don't have a ton of dynamic range. Your GPS may not have a good filter on it and might be saturating, rather than necessarily broad-band RFI.
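
To put numbers on "sub-thermal noise" (standard textbook figures, my own illustration rather than anything from the parent): the received GPS signal sits well below the thermal noise in its own bandwidth, and only the despreading gain recovers it, so a front end saturated by nearby broadband junk loses the signal easily.

    # Why GPS lives below the noise floor: thermal noise kTB vs. received power.
    import math

    k = 1.380649e-23      # Boltzmann constant, J/K
    T = 290               # reference noise temperature, K
    B = 2.046e6           # GPS L1 C/A main-lobe bandwidth, Hz

    noise_dbm = 10 * math.log10(k * T * B * 1000)   # in-band thermal noise, dBm
    gps_dbm = -130                                  # typical received L1 C/A power, dBm
    print(f"thermal noise in {B / 1e6:.1f} MHz: {noise_dbm:.0f} dBm")
    print(f"GPS signal at the antenna: {gps_dbm} dBm, "
          f"about {noise_dbm - gps_dbm:.0f} dB below the noise")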


We put the device with USB on it in an EMI chamber and watched it on a spectrum analyzer. Unless it was in a taped box there was a mountain of noise.


> give us 400 MHz

We have that, it's called 5GHz.

And also 6 GHz now; that allocation is 1200 MHz wide!


5GHz works well as long as your access point is in the same room.

In real life one of the drivers behind having so many wireless devices in the first place is avoiding the effort and cost of laying cables everywhere, 5GHz often doesn't solve that problem well.

For now I'm using powerline devices to connect rooms, with their own set of problems; nothing beats having well-thought-out wiring in the house though.


My house is over 100 years old with fairly thick walls, knob and tube wiring, and a lot of neighboring 2.4Ghz access points. I ended up going under the house and running cat6, which was no small feat considering how tight the crawlspace is.

Of course, the cat6 cable I used subsequently got recalled, and so the manufacturer had to pay for a contractor to rerun it. They said that it was the type of job they wouldn't have even quoted for any price originally because it was so gnarly.

I have 2.4Ghz and 5Ghz from two APs on either end of the house. I turned off support for pre-n speeds on the 2.4Ghz to hopefully save some bandwidth on the beacon frames. I have Ethernet over Power to my garage, where I have a third AP for our inlaw unit.

The Ethernet over Power seems to be pretty good, but I had to find the right brand of equipment for it to not be flaky. WiFi still sucks, but my desktop and TV are hardwired, and it's good enough for mobile devices. I can go for about a week without losing my work VPN connection, through wifi through Ethernet over Power to PDSL.


If you care about interference, Ethernet over Power is awful. It basically turns all of your electrical wires into a giant antenna and broadcasts it as broadband noise over multiple miles.

The only reason the things aren't outright banned is that the frequencies tend to be sub-100 MHz, so it only has a significant impact on HF. Still, nasty things, and many have been found over the legal noise limits.

If you can, I recommend point-to-point 2.4 GHz Wi-Fi links. Since Wi-Fi is regulated by ERP, these use high-gain antennas that usually punch through things better than an omni AP. I've done them through multiple walls from 300-400 ft away.
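
For a sense of the margin a directional link gives you, here's a rough link budget with assumed numbers (20 dBm radios, 14 dBi panels, a guess at wall loss; none of these figures are from the comment above):

    # Rough 2.4 GHz point-to-point link budget with assumed values.
    import math

    def fspl_db(distance_m, freq_hz):
        """Free-space path loss in dB."""
        return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

    tx_dbm     = 20    # assumed radio output power, dBm
    ant_gain   = 14    # assumed gain of each directional panel, dBi
    wall_loss  = 12    # guess for a couple of interior walls, dB
    distance_m = 120   # roughly 400 ft

    loss = fspl_db(distance_m, 2.4e9) + wall_loss
    rx_dbm = tx_dbm + 2 * ant_gain - loss
    print(f"path + wall loss: {loss:.1f} dB, received: {rx_dbm:.1f} dBm")

Something on the order of -45 dBm received is comfortably above typical Wi-Fi receive sensitivity, which is why a pair of panels holds up through walls where an omni AP struggles.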


Several years ago I did some experimenting: https://imgur.com/gallery/0zv5m

This is using a HackRF, and I'm sending a file vs. not sending a file locally (i.e. to roughly max out the throughput) using powerline ethernet.


That certainly sounds reasonable. I've poked around with an SDR though, and the only egregiously noisy thing I've seen is the one 100Mbps Ethernet link from the DSL modem to the router.

Before I ran cat6, I indeed used WDS wifi. It wasn't reliable, though, and the particular hardware I had suffered from a bottleneck somewhere that seemed to limit throughput to 6Mbps or so. I could have spent another $500 on less crappy equipment and maybe made it work, but moving to wires everywhere I could was highly effective.


In apartment buildings, there’s no advantage in a wireless protocol that penetrates walls—quite the opposite. I don’t want my neighbour’s router and mine constantly shouting over each other. I want the signal to end at the wall.


Agreed, this would be ideal. Most people don't understand that literally every access point on a given channel is part of the same broadcast domain even if they're totally separate networks. Every AP that yours can see on the same channel slows down the network for all of them, because wifi is half-duplex. Everyone needs to shut up for a second while one host on the channel transmits.
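
A toy way to see the effect (my own oversimplification, ignoring contention overhead and rate adaptation): airtime on a channel is split among every station that can hear the others, regardless of which network each one belongs to.

    # Crude airtime split: each busy talker on the channel gets roughly 1/N of it.
    channel_rate_mbps = 300   # assumed nominal PHY rate
    busy_networks = 20        # e.g. 20 visible networks, all active at once
    print(f"~{channel_rate_mbps / busy_networks:.0f} Mbps per network, "
          f"before protocol overhead and collisions")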


I see more than 20 networks in the building I'm in, so that is already happening.

Most people I know don't have cables going to different rooms, there's a router where the signal comes in and wifi from there. Often there's not even the choice which room or part of the room that is.

Unless your building is relatively new and someone thought about setting up the cables, it can be very expensive to do so later.


> very expensive

To lay a cat6 cable and have it snake over the ceiling in a cable gutter? The cable, I grant you, isn't dirt cheap, but it's by far the most expensive part of that arrangement.


I live in a small house in a big city and 5ghz is the only usable spectrum. The connection is solid through walls and there is a minimum of interference from neighbors -- I can only see my immediate neighbor's wifi, whereas on 2.4ghz I get broadcasts from the entire block and can't even stream Netflix.

In fact the lack of range works so well that I can use almost the entire spectrum for various networks and not feel guilty


Ran into an issue with my Home Assistant setup, using a USB dongle to create a Zigbee network. Had to use a USB extension cable to move the dongle a few feet away from the Intel NUC that it was connected to.

When I was searching for solutions to my intermittent problems, I didn't believe that an extension would solve it. It just felt like "blowing into the NES cartridge" to me. But I kept seeing the same advice, so I gave it a shot and whaddyaknow? Zero issues afterward.


My Conbee II is struggling with constant disconnecting--to the point that I've had to put my Home Assistant project aside until I can find more time. I've tried putting it on an extension cable after finding a stray GitHub issue advising so, but it was no use.

I'm certainly going to be looking at the problem through a new lens now that I'm aware of these wider issues.


FWIW, the conbee II is... not a great device. It has what seems to be a fundamental flaw with its USB firmware which makes the device reset itself at regular intervals. The result is that it disconnects from the host and reconnects (as if you pulled the actual device out of the plug)

I used to run HA in a VM via qemu/libvirt and it would always fail because when the device reconnects it’s on a different port and the USB pass-through doesn’t work anymore.

Now I’m running the HA VM on Proxmox, which deals with disconnects/reconnects better for pass-through devices: it’s able to reconnect it to the VM without my intervention.

Looking at the logs, this happens between a couple times a week to 10+ times a week.

I tried an extension lead (which had no impact, I think this solves zigbee radio issues but not USB issues), using a powered USB hub, using a USB 2 port, a USB 3 port, three different machines... no dice. It really is a bug in the device itself. And Dresden electronics has piss poor support (still waiting for any answer from their email support six months later). The only avenue for “support” is via GitHub issues where other users answer but not actual Dresden electronics employees.


You beat me to it. I have both a Conbee II and a Nortek HUSBZB, and the Nortek is much more reliable. Whether it's better hardware, or if it's the software (Home Assistant's ZHA), I don't know.

But same experience with Dresden/deCONZ. It's flaky.

Right now I have split networks while I lazily migrate devices away from deCONZ. So far every glitch I've encountered has been on the deCONZ network.

Which is a shame, because I really like the software UI. ZHA doesn't expose as much (or not as simply). But it works better, and that counts for more.


Same deployment scenario here (home assistant on libvirt, usb pass through) and my Conbee 2 has been rock solid, with xiaomi buttons, temp probes and door sensors.


Same. Latest firmware is stable for me, made sure to select a channel that didn't interfere with any wifi networks, and put it on an extension cable. Extremely stable now.


Try the new Texas Instruments devboard for ZigBee - insane amount of power and high quality radio. Zero problems after switching to the new stack.


Wow, is this why I have lag on my MX Master mouse connected to a Thunderbolt dock, whether I'm on my Mac or work PC? It is frustratingly laggy at times.


Some friends seem to think I'm a luddite when I say this, but this is one reason why I have no wireless mice, keyboard or headphones. Why introduce batteries that can lose charge and potentially unreliable wireless connections when you can have a cheaper, more reliable device with a plain old USB cable? The same goes for wifi. I make sure my work computer is wired through ethernet. Why would I want to risk downtime or random disconnections during meetings?


> Why introduce batteries that can lose charge and potentially unreliable wireless connections when you can have a cheaper, more reliable device with a plain old USB cable?

The answer, of course, is that you no longer have as many wires on your desk and in your bag. If you need very high reliability and fidelity, like if you’re a professional gamer or something, or you know you’re in noisy or otherwise troublesome environments, by all means use wired peripherals. I also use a wired keyboard and mouse, but it’s only because my favorite keyboard and mouse aren’t wireless and I haven’t found great wireless alternatives.


Honestly, I've debated going back to wired. I never move my keyboard or mouse from my desk, so I might as well bring back wired peripherals.


Keyboard has zero value in being wireless. Mouse... that's the tough call because the cable is annoying on a mouse.


There is a little value if you're short on USB ports and both mouse and keyboard use Logitech Unifying... but it's a small advantage.


one man's useless is another man's gold.

i love my bt keyboard and the freedom it gives me in the flat.


Imagine if your TV had a wired remote control. Never changing batteries, never having to search for it. What a dream. (No sarcasm btw)


I have a standing desk and I love to walk and talk so being able to pace unencumbered during interminable zoom meetings has been invaluable to me.

Logitech's wireless is spotty on a lot of products and their G hub application is awful but I love my Logitech headset.


Yeah, if you get a little USB extension cable or one of the Apple USB breakout dongles and plug that into the thunderbolt dock/hub, and plug the RF dongle into that, the problem goes away.


Thank you so much! Trying this now.


I miss that wired Apple keyboard with the two USB sockets on the sides. Still have one but don't use it since it's USB-A and would need a dongle plus an extra cable.


Careful, that one has a bad design too: without an extension cable it often doesn’t work. I’ve read it might be because it’s not drawing enough power to switch on.

Apart from that it’s just amazing; better (not as flat) keys than the Magic Keyboard make it perfect to type on, and the USB on the side makes adding a wired mouse easy (and keeps the dongle close for wireless). I ordered several when I heard they were discontinued but still in stock at my old company, just in case.


So is this why my MX Keys and mouse drop BT randomly on my MBP? I noticed it was more likely when I plugged in stuff like an Android phone for debugging, but I thought it was a coincidence... So are there ways to fix this? Shielded cables and hubs?


Are you using a Type-C to Type-C cable with an Android phone? Yeah well, finding a USB 2.0 cable with Type-C plugs on both ends might present a challenge.


They're easy to find. Anker, Amazon Basics, and Monoprice all have them.


Yeah I thought it was relatively well known. I have my Logitech wireless dongle on a USB extension cable just to get it away from the hub.


Would be nice if Logitech made this information available on their support website.


Right? I have a set of 4 F710s game controllers hooked to a pi 4 emulation box and I'd been looking everywhere for info on what to do about their flaky connection.


Nothing Logitech makes has worked for me over USB 3.0 for half a decade now, and I can't believe they don't yet admit to it on their front page in big letters and move to 2.5 or 2.6 GHz or make whatever adjustments are necessary to avoid interference.

The only thing that has worked for me is to plug the Logitech receivers into a USB 2.0 only hub.


I use 3 Logitech USB wireless devices (mouse, keyboard, headset) for my iMac and have had zero wireless issues. It's weird.


You might not have a hub. I added one because the back USB ports and card reader were hard to reach; that is when the problems started. It might be the dock, but I’ve had issues with others too, and I don’t want to order every dock I can find and test which ones don’t interfere, especially after I did some research and found out it may be linked to USB 3.


I have a hub with no issues

Seeing as there is a wide range of quality for USB hubs, part of me wonders if this is just poorly designed ones worsening potential issues


Possibly, but given that even Intel and Logitech comment on this issue, I won't take on the cost and effort of testing them all, and reviews are often also pretty worthless.


I see a lot of misunderstandings in some comments: we are not talking here about desired RF emissions, but about unwanted EMC emissions at harmonic frequencies which just happen to interfere with regular 2.4 GHz devices. This has nothing to do with how much spectrum is allocated to wifi vs mobile vs whatever... Here it is only about EMC (and it has been well known for years that USB is difficult with EMC and requires a lot of care in the PCB design!). And USB3 is not the only product that leads to difficulties: there are, for example, a lot of debates (at least in Europe) now with regard to the impact of LEDs (which might surprise a lot of people). Another example is the debate on Wireless Power Transfer (which involves desired emissions, but with very strong harmonics that affect sensitive radio services).


Do you have links you could share regarding this LED debate?


I'm guessing it is LEDs with PWM brightness or colour control.

e.g. badly designed PWM radiates RF crud from the sharp edges.


(2012)

It also interferes with GPS and Iridium. USB3 is a broadband jammer.

Previous thread on the subject: https://news.ycombinator.com/item?id=24707479


Experienced terrible headaches trying to fix issues with GPS and USB3 at work. I was also surprised by how sparse the information online was. Almost all mentions came from random drone hobbyists' forums when I was searching some time ago.


Same here while working on a UAV even though the GPS module was mounted about 20 cm away from the single board computer. We looked at it with a spectrum analyzer back in the lab and the USB 3.0 ports really sent out some wide band interference that covered the GPS bands. Using USB 2 cables fixed it (apart from USB being really not great mechanically for a combustion engine powered UAV).


Yup, this is well known in the small UAS industry at this point, and I was part of a lab that lived through that "wtf, how does nobody talk about this" experience. It seems like many folks in the space have learned this the hard way.


I make GPS-enabled cameras and am battling with RFI from USB3. Just ordered a bunch of chokes to try on cables to mitigate it. I'm surprised this level of RFI is allowed. Either that or these devices aren't being tested properly, like they are powered up for testing but aren't used with actual data being sent to and from them.


In my experience, USB connected products are EMI tested only with the cable shipped with the product in the box and used per the product instructions or in a "typical" use. Quite a decent amount of effort and cost is spent for some products to ensure that the cable which ships with the product will properly comply with all regulations regarding interference in the countries in which the product is sold.

There's no way a manufacturer could be expected to EMI test a product in every conceivable way a customer might wish to use it. For many products there are simply too many combinations of use cases or features. So the rules generally require testing in a typical customer use case or by following the instructions for use which come with the product.

As soon as you start using cables which didn't come with the product, it's on you to ensure that the emissions of the new cable and the product don't combine to cause problems.

Some cables (not just USB cables) are utter crap for interference.


A well designed product will have good shielding and PCB design, and effective low-pass-filters, all of which keep the stray interference within the enclosure. It should not rely on the cable to meet EMC testing, and the test protocol should define a suitable generic cable.

If it does require a particular cable, then the cable should not be removable.


In an ideal world, I agree with you. But if you ever try to make a mass market USB device you will quickly learn that the computers which your customers will connect your device to and the cables which your customers will use will greatly vary in terms of quality and compliance to the specs.


If we had appropriate EMC standards, then this wouldn't be an issue.

Shonky manufacturers wouldn't be able to gain an advantage by cutting corners.

The standard EMC tests should be specified using generic cables, connectors, etc.


"allowed" on unregulated spectrum.


Are you suggesting that you got headaches from radio interference?


I think he was speaking metaphorically about having to fix equipment that was interfering with neighboring equipment.


Ahh, this makes more sense.


I think they're suggesting that they do professional work on/with GPS and USB3.0.


As others have said. I edited the post for clarity.


your GPS device doesn't actually transmit anything


You can fix that! Get a BladeRF, HackRF, LimeSDR, and you can transmit your own GPS signals:

https://github.com/osqzss/gps-sdr-sim

(Seriously: Don't do this unless you're taking extreme care not to radiate outside your lab.)


I believe he meant that USB 3 devices were effectively jamming GPS receivers.


Yeah, it receives things.


I work at a lab where this is a major nuisance.

I’ve been wondering whether the clock spreading in the USB standard could be manipulated to notch out certain frequencies, like in noise shaping for ADCs.
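
Out of curiosity, here is a toy simulation of what the existing spread-spectrum clocking does (my own scaled-down model, not USB silicon: a ~0.5% triangular down-spread in the spirit of SSC, applied to a 100 MHz square wave standing in for the real 2.5 GHz fundamental). It smears the clock energy and knocks the peak down, but it doesn't create notches, which is presumably what the modulation profile would have to be reshaped to do:

    # Toy comparison: fixed clock vs. triangular spread-spectrum clock (SSC-like).
    import numpy as np

    fs = 1e9        # simulation sample rate, Hz (toy value)
    f0 = 100e6      # nominal clock frequency, Hz (stand-in for 2.5 GHz)
    spread = 0.005  # 0.5% down-spread, similar in spirit to USB 3.0 SSC
    fm = 33e3       # modulation rate, Hz
    t = np.arange(0, 2e-3, 1 / fs)

    # Triangular profile in [-1, 0] so the clock only deviates downward.
    tri = 2 * np.abs((t * fm) % 1.0 - 0.5) - 1.0
    phase = 2 * np.pi * np.cumsum(f0 * (1 + spread * tri)) / fs

    fixed_clk = np.sign(np.sin(2 * np.pi * f0 * t))
    spread_clk = np.sign(np.sin(phase))

    def psd(x):
        return np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2

    freqs = np.fft.rfftfreq(len(t), 1 / fs)
    band = (freqs > 0.9 * f0) & (freqs < 1.1 * f0)
    drop_db = 10 * np.log10(psd(fixed_clk)[band].max() / psd(spread_clk)[band].max())
    print(f"peak spectral density near f0 drops by ~{drop_db:.0f} dB with spreading")

Note the total radiated energy is unchanged; spreading only lowers the peak seen in a narrow measurement bandwidth, which is part of why wideband 2.4 GHz receivers still suffer.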


I have had USB 3.0 devices cause noise which gets picked up by silicon strip detectors in particle physics experiments. These days it's common to have a computer in the target room next to the digitizers to avoid long analog cable runs. Now you have to be careful which USB port you plug into with what device.


One of the problems is the many shitty USB cables out in the wild. The screening on many of them is pathetically inadequate. Not only do these cables emit noise from the USB electronics, but the USB equipment to which they are connected also becomes susceptible to noise from external sources.

On more than one occasion I've had external USB drives lose their MFT (Master File Table) data as a write update was corrupted by noise. Essentially, the external noise killed the data on the disk.

Often, when one pulls these USB cables apart one can see the shielding coverage is 30% or less.


While good shielding is important, the important thing is to use twisted/balanced pairs, eliminate longitudinal currents by means of common mode chokes, and to eliminate ground loops via differential transformers.


I agree with you completely—and I'm not saying that as just an obsequious response but because I know the issues you've raised are so important. Yet nowadays they are so often overlooked or paid scant attention thereto—which I'd suggest is even more the case now because training has to cover a much more diverse range of subject matter than in the past. Leaving specialist courses aside, today, general courses on electrical engineering and electronics place less emphasis on low noise/low-level analogue techniques and more on digital technology. For instance, I recall a good but inexperienced engineer coming to me with a board he'd constructed that was full of digital ICs but which he couldn't get working. A quick glance and I saw that he had left off all the bypass capacitors across the ICs' power rails—not by accident but by not realising their importance.

There's not the time or space to address your points in sufficient detail except to say that years ago, I used to work for a well-known electronics company that made radio and television broadcasting equipment and I was employed in its prototype laboratory. My work covered a diverse range of technologies including audio, VHF and microwave. All those points you've mentioned were day-to-day bread and butter and we had to address them in detail as the equipment had to pass/comply with broadcast standards. For example, we had to design audio equipment so that the 600Ω balanced lines joining them had to reach a specified common mode rejection ratio, this involved multiple techniques, shielding, good earthing, using 'technical' earths and testing EMR leakage to and from interconnecting cables (for instance, Belden Beldfoil cables)—all of which were especially important in very low distortion designs that didn't rely on transformers to achieve balanced inputs and outputs.

Similar issues were relevant in RF and microwave equipment, minimizing stray capacitance and inductance in chassis/component layouts, etc. Rather than give you an example I refer you to one of my treasured books on the subject, Microwave Receivers edited by S.N. Van Voorhis, 1948, specifically pages 261-266, Section 10.6 (although other nearby sections are also relevant). This book is ancient but the theory and rules still apply. It's part (vol 23) of the 28 volume Rad Lab† series produced after WWII and this set is still one of the greatest tomes ever written on any subject in engineering. It can be downloaded from the Internet Archive from here: https://archive.org/details/MITRadiationLaboratorySeries23Mi.... (Incidentally, I still have my softcover Dover edition from when I was a teenager, it's not the kind of book you throw out.)

† List of Rad Lab series: http://web.mit.edu/klund/www/books/radlab.html


Thank you for the resources. I would be interested in your take on this guide: http://123.physics.ucdavis.edu/week_5_files/low%20level%20si...

We have generally had good experiences when sticking to these patterns, but are always looking for possible improvements.


My apologies for not replying to you earlier, I printed the paper out (as I like to make notes in the margins), but I managed to be sidetracked longer than expected. I downloaded the paper in a hurry and I initially thought you were its author and just wanted comment on it. It wasn't until I looked more closely at the download URL and saw 'ucdavis' that I realized that the file resides on that site and not the one where the author works.

There are several issues here: (a) this topic will time out in a few days and I'll not be able to contact you about the matter, which is likely a problem; (b) the paper addresses both general shielding issues and some very specific ones that relate to specialist research equipment; and (c), whilst I am happy to provide some general comment on the paper here at HN, there are some in-depth issues that I would prefer not to comment thereon given the paper's origin and the nature of my intended comment.

The reason for my point (c) is that I used to work at an establishment whose work concerned itself with hardware that is somewhat related to that mentioned in the paper and thus its instrumentation incorporated magnetic and EMR shielding of a similar standard to that [which would likely be] employed on equipment mentioned in the paper.

This raises two issues and it depends on your reason for asking in the first instance. If you want general information about shielding then I have no trouble giving that; alternatively, if you work with technology similar to that which is discussed on page 4 of the paper then it's a somewhat different matter. Thus, I would have to be much more specific, as the shielding requirements are anything but trivial and are often much more demanding to implement. It's here that I have some issues with the paper, and it would be unfair to air them in public without informing the author first (and as he's not sought feedback the matter is moot).

If you are involved in similar technologies that demand similar standards of shielding then I'd be glad to discuss the matter further in private but that seems difficult given there's no way on HN to exchange private messages/ensure privacy.

Anyway, please provide me with more specifics about your application/requirement and if I can contribute anything useful then I'll try to help.


I'll read it and let you know.


Would be interesting to see if this can be used for eavesdropping as a keylogger.


Have you tried wrapping everything with lead?


It's going to be a PITA when USB 832.1 comes out and we need a light-year of lead to deal with neutrino interference.


We do make liberal use of lead in general, but in this case the noise is electrical in nature. The detectors we use tend to behave like radio antennas and so they are well shielded electrically. I suspect the mechanism is the USB 3 making extremely sharp transients that make their way into preamplifiers via the power supplies. It's a slippery thing to track down, though.


Lead makes for fairly poor EMC shielding. It has relatively high resistance and quickly builds an oxide layer.

Copper tape is better, thick copper even better, and thick copper with welded seams better still.


Have you tried putting a Raspberry Pi 3 (without the USB 3.0) in between and sending the signals over Ethernet?


We have historically avoided using Pis for critical things because of inevitable memory card corruption after a few years. Now that there are means to boot from other storage media that might change. What I would really like is a BIOS option to kill 3.0 capability.


Only model 4 has USB3 (and that one comes with two ports that only have USB2), in case you don't need the higher performance.


This is also the main reason why I'm switching everything back to cables. I can barely use my bluetooth headphones in my flat just because I'm using a USB 3 hub. Headphones randomly disconnecting in video calls is not a great experience.

Related: https://annoying.technology/posts/08834ce6ea3edc5a/


The depletion of ports on computers and forced migration to dongles is a trend I am absolutely baffled by. Whenever I'm given the chance I will always go for a laptop with an Ethernet port and as many USB ports as possible - even if that results in a thicker profile. Weight is something I care moderately about, thinness is something that has zero value for me since we passed the inch and a half threshold.

I also, personally, have a strong preference for USB Type A connectors over the Type C and Micro variants, cable stability and port wear is noticeably lessened with the larger cable seating.


You're biased then or buying equipment not to spec.

USB-C ports are supposed to be 6X more durable tested to 10,000 connections vs 1,500 for USB-A.

https://en.wikipedia.org/wiki/USB_hardware https://www.anandtech.com/show/8377/usb-typec-connector-spec...

I've personally noticed the USB-C is much more durable. Something about the cage of the USB-A catching and bending more easily.


I think there is a bias now due to the higher usage with USB-C vs old USB. It used to be you'd either connect something to USB temporarily, like a flash card then back into your pocket after the file transfer, or permanently, like a USB mouse in the back of your desktop you never unplug again.

Now that the port is becoming a charging port, it's used a lot more than data-only USB, and in ways that torque the port worse. You plug your usb phone into a usb brick and it largely isn't going anywhere, but on a laptop, you are plugging and unplugging the device all the time. You might be leaning at angles with it on your lap and adding pressure on the port (something I inadvertently notice myself doing once a week). On top of that, the go bad part, the flimsy inner pin, is on the computer side rather than the cheap cable side. After a year of use, my usb-c port went from snick snick to wobbly, both on my macbook and my nintendo switch.

In contrast, I've never had this happen with lightning port on my iPhone despite all the abuse and lint I give it, because the design is inverted with the flimsy pin on the cable being inserted into a simple slot in the phone. The old macbook magsafe plug was just a magnet holding contacts firm against each other, not even inserted into anything, so if it got torqued or abused it would just pop off and there would be no harm to the computer or the plug really, and you spent no force or effort jamming it into the computer since the magnet did the alignment work and made the connection for you (Typically, I would just grab my magsafe cable a foot up from the end and loosely slap it against the port basically and it would seat itself).

With the shortcomings of the design of USB-C, in comparison to the older standards, you put on a lot more wear and tear on the port.


> In contrast, I've never had this happen with lightning port on my iPhone despite all the abuse and lint I give it, because the design is inverted with the flimsy pin on the cable being inserted into a simple slot in the phone

One point in favor of USB-C when it comes to wear is that the moving parts (springy tongues for metal contact and clips to hold the plug in place) are on the cable, whereas on Lightning they are in the plug. So in regular use with Lightning you are wearing out the port, whereas with USB-C you are wearing out the cable.

Anecdotally, I've worn out the Lightning port on my Apple smart charging case, but I don't have any USB-C phone to compare to (I have USB-C on my laptop but the use case is too different to draw any conclusions)

Also anecdotally, it seems due to required tolerances, it's much more difficult for third parties to manufacture lightning cables - if you buy cheap lightning cables on Amazon, they tend to have poor fit and get loose quickly. Whereas I've never had that problem with the cheap USB-C cables that get packed in with aliexpress junk.


I have a dongle hanging off my desk that has an ethernet cable plugged into one end of it and a keyboard and desk-fan plugged into the side. This dongle is attached to my laptop[1] and my laptop is effectively immobile 'cause quarantine. I plug and unplug my dongle perhaps once a month[2], otherwise it remains in place and my computer doesn't shift - neither is there any significant force on the ethernet cable other than gravity, there is plenty of slack. The cable is slightly cocked out of the socket with one side of the casing resting against the body of the laptop and the other side a few millimeters out. The gravity of the dongle hanging in this position for just over a year has slowly weakened the grip of the connector from what I can tell visually but there are no problems with the socket actually reading data off the connector that I've ever noticed.

The issue is a compounded one - most laptops come with two or maybe three USB-C ports with the expectation that you'll dongle the crap out of them - they may be more stable than USB-A sockets, but they suffer a lot more wear from the need for dongles, which comes about due to the dwindling number of ports available. So USB-C might be a lot more reliable if manufacturers didn't fetishize providing the absolute smallest number of ports possible, but as things are, the UX today is far worse than the UX five or ten years ago, when I'd have a litany of ports to plug in whatever I needed.

I hate dongles, and they work absolutely terribly with USB-C; give me a mix of USB-A and USB-C connectors so I can plug in everything without a dongle and I'd be happy as a clam. I thought I'd write this out because a bad UX was being mistaken for a purely port-based technical issue - I can't say I've had too many problems specifically with the USB-C connectors, but every time I've had to use USB-C connectors it's been a terrible experience - if you follow what I'm saying? I think you may be quite correct about reliability but also missed the compounding factor.

1. Which rests on a laptop stand to angle the device to a better reading level

2. Every time the internet goes out the dongle fails to automatically reconnect for some reason I can't be arsed to figure out


I found my MBP ports started getting dicey after a couple of years, in that vertical deflection and jiggling were necessary to get a good connection.

This was with ~5-10 plug ins a day, 5 days a week, for ~2.5 years. So around 3-6k connections.


I've never had a USB-A port die on a motherboard, but I seem to break USB-A ports on cases, and always the same way. The plastic support for the pins that also acts as keying breaks. Probably just that I tend to buy cheap cases.

Only port I've broken on my laptop is HDMI. Fortunately it has a thunderbolt port as well for video, but now I need a dongle.


Not my experience.

I’ve been using a major OEM laptop and dock at work for about 3 months. The cable is already jiggly.


It doesn't help that the dongles we end up buying are the cheapest off-brand trash available on Amazon. No wonder they leak RF like a garden hose.


Cables have another advantage: it's much easier to switch devices, just plug it in, no more connecting and disconnecting and figuring out how that specific device has implemented pairing.


At least the headphones I'm using (Bose QC 35) can connect to 3 devices at the same time. If it's nicely supported it's not really an issue, but as soon as I step away a few meters from the computer the connection starts to drop, so I'm better off with cables.

Kinda odd that you even have to think about basic features like that these days.


Mine do that and it's more an annoyance than anything.

I'll throw them on to listen to a podcast while I'm cooking or something. But then they'll connect to my desktop as well. Depending on what it's doing that day, sometimes it connects in headset mode instead of headphone mode which means it has exclusive control of the headphones and I can't get my podcast to play for hell or high water.

So instead I have to trek down to the basement, unlock my computer, disconnect the headphones from there, then get back to it.


Also using Bose QC 35’s, and only _today_ (oddly enough) I was getting incredibly annoyed at them for connecting to my phone and laptop (MacBook, in sleep mode, fwiw) and _not_ the Linux machine I had in front of me which was the “first profile” (the one that the device speaks about when powering on, “connecting to... <device name>“)

It seems mine can only be connected to two devices, and gets a bit weird when the secondary device wants to make sound but the primary device is already making sounds.

I have missed calls because of that.


My Sony WH-1000XM4 only connects to 2 devices at once. That's mildly annoying since I would like to connect it to 4 (Phone, Tablet, Laptop, Desktop). It stays paired just fine, but I have to regularly go to the bluetooth settings of the device I want to listen to and press connect. On all but one of those devices that's more work than replugging a cable.


And no more computers fighting about who's going to have the mouse.

And I found my Rpi using 50% cpu being stuck connecting by bluetooth to the digital piano.. for no reason. :)


USB3 seems like such a sh*t storm.

Too many features, no standardized labeling for what cables support, not truly reversible connectors, dongles and hubs that barely work unless you drop hundreds, etc.

I'm hoping to god we learn for USB4.


USB 4 is essentially Thunderbolt 3. It transports either USB 3(.2), PCIe, or DisplayPort in one of two variants.

Support for these features is still mostly optional. The only real upgrade is that -- as far as I can tell -- USB4 requires host devices to support DisplayPort. It's still a USB-C connector, so I don't see the cabling situation improving.


Come on, USB 3 was just gustier winds in the Giant Red Eye of shitstorms that is USB. USB 4 will innovate only in respect to how it manages to deliver new dimensions of incompatibility.


I miss USB2 and such because anything you bought would work for a good amount of time. It's hard to get 4 wires and a connector wrong.

I agree that that will be the most likely outcome, but I wish it wasn't going to be that way.


How many variants of features are there really in practice for cables? Particularly for "brand name" cables from Anker, Amazon Basics, etc?

I've only ever noticed Thunderbolt (which I've always seen denoted by a lightning bolt) and USB-C. And never really had a compatibility issue outside of that.

One issue I have noticed is that there are dozens of 5 and 7 port USB-A hubs out there, but there are basically no 5 or 7 port USB-C hubs. Lots of multi-purpose hubs with an SD slot, multiple USB-A ports, and maybe 2 or 3 USB-C ports. And those USB-C ports will often have higher output via the Power Delivery protocol, so obviously there's a thermal limit to how many of those you can pack onto a small hub. But why are there no simple non-Power-Delivery USB-C hubs that would be a drop-in replacement for USB-A hubs? It seems like this is kind of an obstacle for simpler peripherals switching over to USB-C.


> I've only ever noticed Thunderbolt (which I've always seen denoted by a lightning bolt) and USB-C. And never really had a compatibility issue outside of that.

Oh boy. When it comes to data, USB-C cables can be either limited to USB 2 or fully featured USB 3.2. But that's just one dimension - for high power PD your cable has to be e-marked, and for longer cables (1m) they have to be active in order to support Thunderbolt at full speed.


> I'm hoping to god we learn for USB4.

If one looks at how well Bluetooth finally got straightened out after version 5, I think we have good reason to hope!


> not truly reversible connectors

What do you mean by that? USB-C connectors are not truly reversible? USB3 can also exist in USB-A and B variants that are indeed non reversible.


Because of how the connector is laid out, it's possible for devices to not work (or to work differently) depending which way up you plug it in: https://twitter.com/mifune/status/1373564866443759617


Oh that, I remember seeing the video. Are there any real world devices that have problems depending on the orientation though? This one is just a demo meant to demonstrate that if you really want to, you could make usb-c dependent on the orientation.


I bought this USB-C extension cable, and learned that it only works for USB debugging on Android when the Pixel's stock cable is plugged in the "correct" way. There's even a little label on the end of the extension cord telling you to reverse the connection if it doesn't work. /facepalm

https://smile.amazon.com/gp/product/B071DMMW4J


I've got a Nexus 5x that now only charges if my USB A -> C cables are oriented properly. When it was new, it would charge in both directions as expected.


Yes. Early USB-C controllers had notorious issues with CC pins getting fried out by short-to-Vbus events, which could make the port work only in one orientation (or not work at all if you're unlucky enough to fry both CC pins).


Perhaps not, I suspect that the pins are probably normally tied together. That it's possible, and some of the failure modes are _weird_ is a bit frustrating.


a round connector that can be inserted in any orientation would be ideal... we can only dream


It's way past time for USB over TRRS.


That would require a pretty long connector and a correspondingly deep port.


We could use 2.5mm headphone jacks. TRRS would be enough pins for USB 2. Just need marketing to convince people that slower and less interference is good and valorous.


Why would it need to be longer than a current USB connector?


Because it is a round pin, the only way to get more individual contacts is to make the pin longer. If you tried to differentiate contacts in any other direction, you could not rotate the plug anymore. Edit: whoops, had an old page open


I don't think you can just take the current design and "make it round." You can't just put pins on a cylinder. They wouldn't align as you rotated the connector. The other way to do it would be co-axially like a headphone jack...but those are deep.


> I don't think you can just take the current design and "make it round."

While you are taking the rectangle and turning it into a circle, go ahead and change the design...


Lightning


Synology actually has an option in their routers to "Downgrade USB 3.0 device to reduce interference of 2.4G signal" (so I assume downgrading the USB 3.0 port to USB 2.0 speeds would decrease the interference)


Yes. The interference from USB 3 connections comes from the 5 Gb/s data rate of the SuperSpeed transceivers, which causes interference with 2.4 GHz band radio devices. If you downgrade to a USB 2.0 connection you effectively disable the 5 Gb/s transceivers and only use the 480 Mb/s data link.
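
As a sanity check on why 2.4 GHz gets hit so hard (my own idealized arithmetic, not from the linked paper): scrambled SuperSpeed traffic looks roughly like random NRZ data, whose power spectrum follows a sinc² envelope with its first null at 5 GHz, so the 2.4 GHz ISM band sits only a few dB below the spectral peak.

    # Idealized NRZ spectrum check: how much 5 Gb/s energy lands near 2.4 GHz.
    import numpy as np

    def nrz_psd_db(f_hz, bitrate):
        """Relative PSD (0 dB at DC) of ideal random NRZ data: sinc^2(f/bitrate)."""
        return 20 * np.log10(np.abs(np.sinc(f_hz / bitrate)))  # np.sinc = sin(pi x)/(pi x)

    for f in (2.40e9, 2.48e9):
        print(f"{f / 1e9:.2f} GHz: {nrz_psd_db(f, 5e9):5.1f} dB relative to the peak")

By the same envelope, a 480 Mb/s USB 2.0 link keeps the bulk of its energy well below 1 GHz, which matches the Synology workaround above.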


I wonder if USB WiFi dongles account for this at all.


They don’t. Realtek is probably the most popular, and they’re notorious for causing problems over USB 3.


I recently built a Bluetooth transmitter that can advertise by transmitting binary bits at 5 Gb/s, which has essentially the same physical characteristics as USB 3.0 [1]. While I used an FPGA, I wonder if one could intermingle the right bits amidst the rest of the USB 3.0 protocol to build a Bluetooth transmitter...

It's not surprising but still insane that so many USB cables aren't properly shielded, thus making all of the FCC's efforts regulating devices effectively useless as your USB cable turns into an antenna to transmit garbage.

[1] https://twitter.com/newhouseb/status/1352796299700162560 (note this is running at 6ghz, but also works at 5ghz w/ more noise)


All the more reason to move to optical fiber+power cable like they were originally planning for thunderbolt/USB3.


I remember that optical transceivers are expensive. On the flip side, there was optical SPDIF way back when.


Not really, a 10 Gbps SFP+ can be had for $18. A 1 Gbps SFP is around $8. They continue to drop in price. With an SFP you can go 500 m to 10 km, at around $0.06 per meter for a pair of fibers.


You're competing with something that costs a few cents. $8/$18 would be more than the value of some of the devices we're dealing with here, and you need two. This is why Intel has at least twice now promulgated early specifications based on fiber and then fallen back to copper; manufacturers won't build the products because users won't pay the price.

There isn't a good solution here. People want neato fast stuff and they want it cheap, small and everything else that precludes good RF hygiene, and no one wants regulators interfering. It's intractable.


Seems like a lot of cheap stuff does not need to be fast, so USB2 is good enough. I would like to have seen something like those audio aux jacks with built in TOSLINK inline like the old MBPs had. Maybe a USB2 port with an optional optical channel somehow squeezed in would be really neat.


This $19.99 USB storage device [1] has a read speed over 3x maximum USB2. Next year it will be $10. The year after it will be $5. Or some such curve. Good luck telling people they can't have it because you say so.

Optical stuff has a curve too, but until it can compete with a bit of stamped metal inside a molding on price, it will be too expensive for USB. The market simply will not allow it.

[1] https://www.bestbuy.com/site/pny-elite-x-fit-128gb-usb-3-1-f...

Some problems are intractable.


Idk... I bought a USB-C disk drive that is faster than a SAN I bought in 2016.


>On the flip side, there was optical SPDIF way back when.

The bitrate is also much lower for SPDIF. According to Wikipedia it only supports uncompressed 48 kHz 20-bit PCM audio, which translates to a bandwidth of about 120 KB/s per channel. I'm not sure when it was introduced, but Wikipedia says USB 1.x was introduced in 1996 and had a bandwidth of up to 1.5 MB/s.


TOSLINK (the optical SPDIF standard) originally had a max bitrate of 3.1Mbit (387.5KB/s).

https://en.wikipedia.org/wiki/TOSLINK


That probably explains why the soundbar connected to my PC via optical is silent if I set the output above 48kHz (24-bit works though)

I'm back to plugging headphones directly into the case, as the output from the jack on the screen was noticeably worse


I know that is currently true, but at scale can they not bring down the cost for these optical interfaces?


The main problem is that to get high bitrates over anything longer than 2m you probably need a glass fibre. And that increases the cost a lot, and makes the cables more fragile. Corning offers an optical USB3 cable where the optical converter fits in a slightly larger plug, so it's not impossible. But most people wouldn't pay for it when they don't need 50ft cables.


UTP seems to be doing just fine.


At 1m, the 10Gbit/s fiber cable with both transceivers is about 1.5x the price of a USB 3.2 cable.

Of course USB wants to also support dirt cheap devices that don't need all that transfer speed. So you probably need to put the transceivers in the cable, to allow cheap devices to ship low-speed copper cables. Then you have problems with bulky cables, in addition to the bending radius challenges of fiber.


Unfortunately Macbooks have an unresolved issue where the Bluetooth and Wi-Fi interfere with each other. I've tried Bluetooth mice on my 2015 MBP and as soon as I turn on Wi-Fi, the mouse cursor becomes "jumpy" and unstable. It's so annoying that I switched back to wired mice.

It's crazy that the issue was reported as far back as 2011, but Apple didn't do anything about it.


It's not just Macbook Pros, for which I've also experienced the same issue, but the Mac Mini M1s have flaky Bluetooth as well. Many people are experiencing random input lag with them. It boggles the mind that Apple continues to ship broken Bluetooth implementations on their machines.


I wonder if it's been fixed. I use a Bluetooth mouse with an M1-based Macbook Air, as well as with a 2018 MBP. I haven't noticed any issues with WiFi or with the mouse in either.

I'm pretty sure I've also used that arrangement with both 2.4Ghz and 5Ghz Wifi connections.


I have an M1 MBA that uses a 2.4ghz dongle plugged into a USB-C hub from Anker to connect to a Logitech mouse and have no problems with it.

I also have an M1 MM that uses a 2.4ghz dongle plugged into a USB-C hub from Anker to connect to a Logitech mouse and it’s barely usable.


I notice I can get that behavior if I switch off or on the wifi radio and immediately move the mouse, but I don't notice anything otherwise.


Ah yes, the Logitech fiasco. It is a great story of how several electrical engineers designing separately, and not understanding software impacts, could make a sub-optimal result.

The big takeaway is that at 5 GHz, signals often "leap off the conductors" at the slightest provocation. And clock dithering (spread-spectrum clocking) and other attempts at breaking data/signal correlation have limited ability to counter this.

For a long time I had a USB 2.0 cable extender with an RF choke on it, that I would connect to USB 3.x hubs and then plug the Logitech transceiver into that.


Nothing beats a proper wired network with a proper wired headset. Let mouse and keyboard send their 10 bytes per second wirelessly.


Unrelated but interesting: certain HDMI resolutions on the RPi will interfere with the 2.4 GHz transceiver, effectively jamming it (rough numbers below).

2.4ghz is like the duct tape of the electromagnetic spectrum...everything runs there.
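
A plausible mechanism for the RPi case (my own back-of-envelope, not from the article): the TMDS clock runs at the pixel clock rate, and for some video modes one of its harmonics lands squarely in the 2.4 GHz band.

  # Quick check with assumed numbers: which harmonics of an HDMI pixel
  # clock fall inside the 2.4 GHz ISM band (2.400-2.4835 GHz)?
  BAND = (2.400e9, 2.4835e9)

  def offending_harmonics(pixel_clock_hz, max_harmonic=20):
      return [(n, n * pixel_clock_hz / 1e9)
              for n in range(1, max_harmonic + 1)
              if BAND[0] <= n * pixel_clock_hz <= BAND[1]]

  # 2560x1440 @ 60 Hz (CVT-RB) uses a ~241.5 MHz pixel clock; its 10th
  # harmonic sits at 2.415 GHz, right on top of Wi-Fi channels 1-3.
  print(offending_harmonics(241.5e6))    # -> [(10, 2.415)]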


It amuses me that the whole reason 2.4GHz exists as an ISM band is that the band was too noisy to make use of.

Basically microwave ovens came before the ISM band. 2.4GHz is the sweet spot for heating water with microwaves - there's a few frequencies that work, but too high is expensive and difficult to shield, too low requires more power to get the same results.

So this band was effectively "written off" by the ITU/FCC as being too noisy to commercialise. The result is a band where you can do almost anything you want, as long as you emit less power than a microwave oven does. This makes it ideal for local-range applications that don't want to deal with the licensing requirements of 'real' bands - as long as they don't mind sharing it with microwaves. It's the typical story of a lightly-regulated free-for-all.

2.4GHz isn't a mess because everything uses it - everything uses it because it's a mess.


> 2.4GHz is the sweet spot for heating water with microwaves -

Sorry, but no.

2.4GHz being a sweet spot for water molecules is a myth.

https://wtamu.edu/~cbaird/sq/2014/10/15/why-are-the-microwav...


I didn't mean to imply it's a magic number - it's a sweet spot as a compromise between power/cost/efficiency. The band wasn't authorized for communications until it was already pretty much spoiled.


It wasn't "authorized for communications".

It was always a junk ISM band, eg used for industrial applications that radiate interference.

Companies can use it on the condition that they accept any interference.


It happens the other way too. I once wasted a few hours debugging why 4K mode wasn't working on some mini-PC. It turned out that the cause was a mouse transceiver plugged into the USB port adjacent to the HDMI...


We had lots of issues with USB3 cameras interfering with (RTK-) GPS receivers for a drone project [0]. When mounted on the drone, the receiver would just not get a fix, even in seemingly perfect conditions, i.e. unobstructed view of the sky, far away from buildings, no clouds, etc. One day, I randomly unplugged the cameras and suddenly the receiver started working. I repeatedly plugged and unplugged the USB3 hub just to make sure I'm not crazy. The GPS receiver would go from no fix at all to centimeter-level accuracy every time I unplugged.

We then used a spectrum analyzer to better understand the extent of the interference, tried shielding as described in the whitepaper, as well as physically separating the components by as much as 30 cm -- all without success. The only solution that worked was to replace the USB3 cables with USB2 cables and acquire images at a lower frame rate.

I don't even want to know how many people have been affected by these issues over the years. USB3 devices should come with a warning sticker on the box.

[0]: https://github.com/lis-epfl/vswarm


So THAT's why plugging any USB3 external HD into my "factotum" home server immediately breaks all my Zigbee-based home automation.

And I thought the motherboard was defective...


Making a USB 3.0 or Thunderbolt device which doesn't kill WiFi when they are working together is really, really challenging, but still possible in practice.

Things can go as extreme as covering the entire USB 3.0 lane on the PCB with a solid RF shield from the chip, to the connector.

Measures like the above preclude any chance of USB 3.0 getting into cheaper product niches.

I once looked for a good USB 3.0 testbench laptop to test devices with, but found out that laptops themselves have terrible USB 3.0 RF isolation.

I went through many laptops from reputable brands, but the only laptop I've seen where USB 3.0 and WiFi worked flawlessly was a very old Sony laptop from 2010.


USB thumb drives always mess with my wireless mouse. The first time I encountered the problem was a huge waste of time.


Obviously this isn't going to be affecting a huge number of people in 2021, but if you listen to AM radio (I'm a bit of an anorak for Radio Caroline so I've been trying to pick that up) it's amazing how much interference modern devices give off. The monitor I bought last month absolutely wipes out 648 kHz, and Apple's Magic Trackpad 2 is a pretty bad offender as well.


Yep! I have a Jabra headset and I picked up a fancy newish Razer mouse that uses 2.4. Very odd system interrupt behavior slows the whole system down. I changed the mouse over to bluetooth and the problem goes away. It's not as EXXXXXXTREME as using 2.4 on the mouse, but hey - the system isn't so laggy now.


Interesting. I ended up going back to 2.4 on my mouse as I'd have all sorts of problems with a Bluetooth keyboard, mouse, and headset connected simultaneously (on a Macbook Pro).


> I ended up going back to 2.4 on my mouse as I'd have all sorts of problems with a Bluetooth keyboard, mouse, and headset connected simultaneously (on a Macbook Pro).

I had a similar problem, and it turned out, I think, to be Bluetooth congestion.

I was in an office that was partially a call center, so lots of people on Bluetooth headsets. Plus all their mice and trackpads. Plus their keyboards. Plus regular headphones. Plus plus plus.

My desk faced out the window onto a public street. Very often, when a group of people would walk by my window, presumably each with their own Bluetooth devices, my trackpad would disconnect from the MacBook.

The solution was to keep the trackpad plugged into the machine, since I never took either anywhere anyway.


I've been seeing the mouse/keyboard issue at home, mostly.

But I'm reasonably sure I've also experienced the Bluetooth congestion issue in a different context. I have an old, mediocre Bluetooth head unit in my car, and it disconnects from my phone ~90% of the time I drive in traffic. Works flawlessly on back roads and the like.


What? Bluetooth is 2.4 GHz too. I guess it's just a proprietary 2.4 GHz radio protocol (Nordic Semiconductor low-latency proprietary, I'd guess) vs Bluetooth (also on 2.4 GHz).


Something like that, correct. According to the tech specs you have "Razer™ HyperSpeed Wireless" and Bluetooth as options. The hyperspeed thing is what causes system interrupts out the wazoo.


I have a cluster of Raspberry Pis set up next to my SmartThings hub. I upgraded storage to use external SSDs via USB 3.0 cables. After that, all my ZigBee devices dropped communication with my hub. After some troubleshooting, I just moved my hub to a different room.


When I connect my external SSD to my MacBook, WiFi dies because my connection operates at 2.4 GHz. I fix it by placing a metal plate (such as a mobile phone) over the wire between the MacBook and the SSD. It also helps to switch from the left-side USB port to the right-side USB port.


Anecdata: Amazingly, just minutes ago I had to relocate my mouse's (MX Master 2S) wireless receiver because of this exact problem - and it's the example used!


A strange thing happened to me: I have a Bluetooth dongle and headset. If I plug the dongle into a USB3 port, it has a range of about 6". Fortunately my case has a pair of USB2 ports, and my MB has a matching header. When I wired those up, it works pretty much anywhere in the room. Since the BT dongle is a USB 2 device, I would have thought the port would fall back to the lower-frequency USB 2 signaling and the EMI would be the same as a USB2 port. Not sure why it isn't.


I'm having a weird issue with a cheap projector I bought on Amazon: every time I power it on, all the devices surrounding it stop working with WiFi. All my LIFX lights disconnect and my phone cannot see the WiFi signal anymore, all until I turn the projector off. Could this be related? I don't even know where to start diagnosing it.


The SmallNetBuilder review of the AC68U goes into detail on this. But I think newer WiFi routers that have USB ports have fixed this.

http://www.smallnetbuilder.com/wireless/wireless-reviews/322...


Speaking of RF interference.

For the first time I bought myself nice headphones. And for some reason, unlike previous headphones, I can now hear very clear buzzing patterns any time my phone gets a Signal message.

I don't get it. It's not SMS, it's Signal messages over LTE. And the phone isn't connected to my headphones in any way.

And it just really diminishes my enjoyment of these things.


I'm guessing your headphones are wired? Move your phone and/or its charging cable.

If your headphones are wired into the back of a desktop, bad power shielding in your desktop itself can cause similar problems. An external dac can isolate your headphones from your desktop.


Oh you know what, I think you just helped me realise what might be the culprit. The cable on these headphones is like 8 feet longer than on my cheap-o-phones, so I've run it around the back and to the side of my desk and then back out front.

As a result I probably changed how my headphone cable behaves as an antenna compared to the shorter cable that went straight from laptop to my ears.


I can't tell you how to fix it, but it is pretty common. You can put your phone on top of a speaker and hear some clicking a couple of seconds before you get a phone call for example.


It's all to do with cheap and nasty design and watered-down EMC immunity standards.

Rather than meet an appropriate level of RF immunity, manufacturers/importers lobbied the FCC to permit lax standards and instead rely on an FCC Part 15 compliance sticker which reads "...this device must accept any interference received, including interference that may cause undesired operation".


I used to get terrible GSM buzz on my speakers, but that was many years ago. Since switching to LTE I've not noticed it.


Yep, lots of interference with my Bluetooth headphones. On Linux it gets a little better if you disable the WiFi driver option for "bluetooth coexistence" (it's named slightly differently between the various drivers; example below).

I don't actually use wifi, but I suppose it's got something to do with wifi and bluetooth being handled by the same mPCIe card.
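
For reference, on Intel cards the knob is an iwlwifi module parameter; other drivers name theirs differently (ath9k calls its equivalent btcoex_enable), so treat this purely as an example:

  # /etc/modprobe.d/iwlwifi.conf -- disable WiFi/Bluetooth coexistence
  options iwlwifi bt_coex_active=0

Reload the module (or reboot) for it to take effect.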


Or rather (almost) the same frequency.


Learned about this recently - it seems semi-notorious as a reason for devices like Zigbee transmitters to have soft failures.


Thank you Intel. I had to wrap the cable for my IOGEAR 3-port USB-C hub in an electrostatic bag to keep it from breaking WiFi on my butterfly MacBook Pro for just this reason.


Pretty sure I encountered this on my Raspberry Pi 4 with WiFi and USB 3 devices. I don't have a link to the forum, but I remember there being a discussion about it.



Do all these devices therefore violate FCC rules about interference? Should we be asking the FCC to fine these device manufacturers?


Presumably this would also be true of USB 4 and Thunderbolt 4 (both of which support USB 3.2 Gen 2x2), then?


Possibly, but USB-C connectors are generally much better shielded than USB-A.


So it's connector-specific? In other words, if I have a USB cable with USB-A on one end and USB-C on the other, the USB-A connector will be the thing introducing RF interference?


No. There's a long list of steps that a designer can take to eliminate EMI.

Things like good shielding, appropriate PCB design, differential cable drivers, and common-mode chokes.

If it were a good design, it would work satisfactorily with generic cables and connectors. And if it does require specific connectors (etc) it should not have removable cables.


Not necessarily, but it's slightly easier to make properly shielded USB-C connectors than it is for USB-A. I'm sure plenty of manufacturers will screw it up.


Wonder if this could be relied on for fingerprinting, exfiltration, or other creative uses...


Probably. But I'm sure it's very short range. You can already theoretically snoop on someone, albeit usually only with the resources of a nation state (or maybe less), via display RF noise, keyboard noise (both RF and audible), powerline noise, etc. So much of what we do technologically is very RF noisy.


Had this problem on my raspberry Pis with external SSDs


I wonder if this can also interfere with Ant+?


Same frequency band, so probably.


TL;DR: Your Bluetooth dongle doesn't work because you've plugged it into or right next to a USB 3.0 port which is jamming it to bits. Get any decent USB extension cable and connect it to a USB 2.0 port and everything will be fine.



