Wireless Is a Trap (benkuhn.net)
592 points by finolex1 on June 21, 2020 | 419 comments



I totally agree with the writer. I have dedicated the last ten years to specifying and implementing a new single-wire communication protocol for IoT and home automation applications called PJDL (Padded Jittering Data Link): https://github.com/gioblu/PJON/blob/master/src/strategies/So...

It performs much better than wireless alternatives.


16 kbps over 2 km, wow! How sensitive is the system to the quality of the connection?

If I hide enamel-coated filament just beneath my lawn, is that going to work for a one-wire bus? Does it even need to be insulated?


It works even if you use the human body or water as a conductor, as I showcase in this demonstration (sorry for my strong Italian accent): https://www.youtube.com/watch?v=GWlhKD5lz5w

This is the implementation: https://github.com/gioblu/PJON/tree/master/src/strategies/So...

This is the source code: https://github.com/gioblu/PJON/blob/master/src/strategies/So...

It is extremely rugged and works much better than any alternative I have ever tried. Ah, and it is free and open source ;)


Given the interest my comment generated, I decided to create a post about PJDL here on HN; if you liked it, please upvote it: https://news.ycombinator.com/item?id=23591953


The Italian accent made me think of Marconi. Also, your accent isn't heavy at all; it just adds a little bit of color.


That is a really good compliment :)


The awesome thing about wireless IoT is that it can be set up without any additional cabling, which can be an extremely painful process in most homes.

In the case of wireless, all you need is an ESP32, a relay, and a power supply. All of that together will cost less than $10.

Can you easily (cheaply) use the already existing power cables for PJDL? I know there are network repeaters that use power cables to transfer data, but I have no idea how much it would cost in an IoT scenario. Do you have any examples?


Yes, many users have repurposed old phone wiring. Here in Italy it is embedded in the walls of most houses and present in most rooms. Being unused and obsolete, it can be used for home automation with PJON :)


As a software person who doesn't know much about this sort of thing: how does this work electrically? Presumably there's no current flowing as it's an open circuit? Is it akin to a voltmeter using the ground as a point of reference? Why don't other wire-based protocols use the single-wire approach?

Does it have anything in common with 'single-wire earth return', in power-distribution?


The protocol is really simple: +5 V is 1, 0 V is 0. The circuit is just a single conductor and a common ground pin. If you connect an LED between the bus and ground you can visually see packets in the form of light pulses.
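To make the "single I/O pin plus ground" idea concrete, here is a minimal, hypothetical bit-banged transmitter in Arduino-style C++. It is not PJDL's actual framing (the real strategy adds padding bits, synchronization, acks and collision avoidance); the pin number and bit duration below are assumptions chosen only for illustration.

    // Hypothetical single-wire transmitter sketch (Arduino-style C++).
    // NOT the real PJDL encoding; just "+5 V is 1, 0 V is 0" on one pin + ground.

    const uint8_t BUS_PIN = 12;        // assumed data pin
    const uint16_t BIT_WIDTH_US = 44;  // assumed bit duration (~22 kbps raw)

    void setup() {
      pinMode(BUS_PIN, OUTPUT);
      digitalWrite(BUS_PIN, LOW);      // bus idles low
    }

    void sendByte(uint8_t b) {
      for (uint8_t i = 0; i < 8; i++) {
        // drive the wire high for a 1 bit, low for a 0 bit, LSB first
        digitalWrite(BUS_PIN, (b >> i) & 1 ? HIGH : LOW);
        delayMicroseconds(BIT_WIDTH_US);
      }
      digitalWrite(BUS_PIN, LOW);      // return to idle
    }

    void loop() {
      sendByte('A');                   // an LED from the bus to ground blinks with each byte
      delay(1000);
    }

A receiver on the same wire would sample the pin at the same bit period; in practice you also need framing and timing tolerance, which is what the real PJDL strategy linked above implements.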


Here's why it's a good idea to insulate underground cables or use optical fiber: https://youtu.be/Ev0PL892zSE?t=357 (TLDW: the salient point is at 7:20: https://youtu.be/Ev0PL892zSE?t=440).

Though, I don't know if insulation helps in the case of lightning. Apparently with Ethernet, special grounding thingies help.


Yes, in production it is suggested to use current-limiting resistors, pull-down resistors, a fuse, and a Zener diode, TVS, or varistor. With all the suggested protective circuitry, in the worst case the fuse should be the only component to fry in the event of a direct lightning strike.


Lightning crosses several hundred meters of air at the very minimum; what makes you think it won't arc over a few millimeters more, if that's the path of least impedance?

In general, even the most rugged lightning/surge protection only decreases the likelihood of the protected equipment being vaporized.


I agree that in some cases there is nothing to be done: the strike will cause damage and no protection can prevent it, especially if the equipment is hit directly. I would have corrected my previous statement, but it can no longer be edited.


Have you worked with VDSL modems and lines?

It can be a nightmare.


Looks cool. Why single wire and not a simple twisted pair? Are you using a voltage relative to earth for your logic states? Just not sure how it could work otherwise.

A twisted pair seems nice to me because you can use differential signaling and not have to ground your devices e.g. 2 wire RS485.


Yes, the voltage is relative to ground for the logic states. Differential signaling requires a source of negative voltage and a more complex front end; being relative to ground, you just need a digital I/O pin operating at 5 V or 3.3 V and a ground pin, with no additional hardware (although current-limiting and pull-down resistors are suggested). As a result it can be fully and easily bit-banged. A twisted pair (where one wire is the bus and the other the ground) is one of the suggested approaches, since it helps to reduce the interference induced on the wire.


Radio systems have antennas that radiate energy into the atmosphere. Other antennas resonate in harmony.

What if you just connect the antenna wires of all the devices together?

That's what a single wire protocol does.


I'm not really knowledgeable so excuse me if I have a misunderstanding as I try to reason about this from a different perspective.

When you have an object with static electricity and put it in contact with a different object, there will be a transient current, even if the circuit is not closed. The objects basically work as poor-man's capacitors.

Even though the name is "static electricity", when the objects come into contact, it's actually more dynamic than normal, because there is no current between two objects of fixed and different voltage. The more current that flows, the lesser the voltage difference becomes, in an exponential fashion.

This kind of single-wire protocol operates solely on this kind of transient current. Because the frequency is so high, very little current flows during one period of the waveform, so the receiver will maintain a constant potential, as if it were grounded.

That it isn't actually grounded means the signal becomes high-pass filtered. In particular, the DC-component must be completely blocked as it would lead to charge buildup with nowhere to go.


> Dongles. Even though all computers now have built-in Bluetooth, many Bluetooth accessories today still ship with proprietary dongles. I assume this is because the manufacturer was worried about inconsistencies or incompatibilities between their own Bluetooth implementation and your computer’s built-in Bluetooth hardware/drivers.

I've never seen this before. Is the author mistaking proprietary 2.4 GHz receivers (that do not use Bluetooth) for OEMs' "own Bluetooth implementation"?

They refer to swapping between built-in Bluetooth and the dongle, which sounds like Logitech's line of dual-mode keyboards that can use either Bluetooth or Logitech's proprietary "unifying receiver," which is based on Nordic Semiconductor's nRF24L, not Bluetooth.


Bluetooth implementations are so stupidly quirky I can totally see proprietary dongles as being required to guarantee connectivity.

Fitbit won't pair with my phone, but works OK with my wife's phone and my tablet. My Garmin fitness tracker syncs with all three devices, but will stop syncing with my wife's phone within 2-3 days and will need to be paired again. My Bluetooth headphones work with my phone and my wife's phone, but not my tablet. Every 6 months or so I still have to use a paperclip to reset the headphones to be able to re-pair them with my phone after they stop working for no reason.

I've seen similar problems with phones I've owned before.


Another example of this is with 'Windows Mixed Reality' VR headsets. The controllers connect with the computer via Bluetooth and will often be noticeably laggy or unreliable when using onboard Bluetooth due to interference from other devices on the motherboard. Switching to an external Bluetooth dongle on a USB extension cable results in significantly improved controller tracking, and this issue is partly to blame for the poor reputation of WMR controller tracking.


I suspect that's one of the reasons Valve put a Bluetooth receiver inside the original Vive headset.


They also didn't use it to connect to the controllers. They used two nRF24L01 transceivers, which use Nordic's proprietary ShockBurst protocol. Easier to set up and better latency.


Yeah, the second-gen[1] Samsung Odyssey+ headset likewise has a built-in dedicated receiver in the headset and it's a significant improvement over the first gen. (I like the first gen better in practically every other way though.)

[1] There's much debate about what 'first gen', 'second gen' actually mean in terms of VR, I just mean it's Samsung's second WMR model after the Odyssey HMD.


I remember a friend had a PDA with Bluetooth over 15 years ago. It's dumbfounding that after being around for so long it's still such a train wreck even on mainstream products.


If Apple made a new protocol called APAN (Apple Personal Area Network) where they got rid of all the BS in Bluetooth and did it properly, kept it proprietary for 5 years, and then released it to the general market, I bet it would sell like hotcakes.

Make AirPods 3 APAN- and BT-compatible, with the same latency characteristics as theatre wireless mics (really low). Same with the mice and keyboards.


Bruh, they did! They call it the "W1 chip" and it's in AirPods and Beats.


It's still BT inside. The W1 adds some convenient new pairing logic. It's not a new stack.


I have the same problem with Bluetooth in my car; it unpairs my phone at random.


What the author means: if you've ever worked with actual Bluetooth, you'll see that it's full of insane quirks and vendor bugs. This opens up the risk that your product will be seen as bad because the user has a crappy BT chipset in their computer. Just see the debate about Bluetooth headphone latency here on HN: even engineers don't realise that the experience is very much dependent on what kind of chipset they have on BOTH sides.

This is why manufacturers build their own 2.4GHz proprietary dongles using proprietary protocols - it ensures consistency for the users.


Supposedly the drivers are also full of hacks: blacklists and whitelists for which devices actually support certain features. A device might say it supports a specific codec, but then when the phone starts streaming audio it causes the headset to lock up.

It's a bunch of hacks all hacking around the other hacks.


The BT protocols are quite complicated and therefore the different implementations will be full of quirks IMHO. Try debugging an audio BT issue on a phone and you will see the amount of complexity.

Interoperability is tested, but not in an organized way and not repeatably; AFAIK three different teams are supposed to check that implementations of a new protocol work against each other. The different vendors verify their BT implementations, but not against each other. That would be expensive, I guess.

The firmware in BT devices can also use a number of hacks which don't break the spirit of the specs as such, and this is not being tested. On top of this, BT and Wi-Fi often have to work together (shared antenna), which means that BT will not get all the airtime it needs, only a percentage based on what Wi-Fi is doing at the time.

I try to use Ethernet every time I get the chance.


> it's full of insane quirks and vendor bugs

As someone not familiar at the low level, can somebody explain how this can still be happening today? Are there not libraries that help ensure proper implementation? I really don't understand.


There are multiple effects going on. Firstly, the Bluetooth spec itself is quite complex: the 'core specification' document is nearly 3,000 pages of very dense technical detail, because it's aiming to allow anything to connect to anything. Secondly, many Bluetooth devices are both heavily power and cost optimised: you don't have a lot of silicon and you don't get to use it very often if you want your product to have a reasonable battery life. This is especially true of the radio in a Bluetooth device; it adds a whole other level of complexity to your implementation and is one area where bugs are very easy to introduce. It also requires a fairly complex dance between hardware and software (most of the Bluetooth spec is implemented in software, usually C. The hardware mostly consists of a radio capable of transmitting the right 2.4 GHz signals with the right modulation scheme, but when you add power management to that it gets gnarly).

Thirdly, hardware companies are generally not great at software. A lot of effort goes into physical testing and validation, but the base level of quality of the code isn't great. Also, the quality of a given Bluetooth device often depends on the quality of the custom code for it: there's a general trend of running application code on the same CPU as the Bluetooth code, and in an embedded context there isn't generally great separation between the two (Nordic's approach is probably the best here). A third party making a cheap pair of Bluetooth headphones is not likely to produce a quality result, even if using a library which is relatively bug-free.

Finally, there's a vicious spiral effect: because you need to interact with a wide variety of other implementations, which may all be buggy or quirky in their own ways, you wind up needing to test against a wide array of other devices (expensive, and it may not always happen) and play whack-a-mole with the remaining issues that appear, which in turn creates more quirks and more opportunities for bugs.


Hardware makers basically can't write good software.


That doesn't explain why Bluetooth is the protocol that causes the most trouble. Proprietary 2.4 GHz devices appear to work just fine.


A Bluetooth device is much more complex than the proprietary wireless devices. The complexity is largely because Bluetooth hardware must support many different types of devices, all with differing requirements. A proprietary keyboard dongle doesn't have to be able to deal with the audio traffic coming from a microphone or going to a headset. Furthermore, it doesn't have to support proper identification and authorization of each individual device; many of them will accept input from any compatible keyboard that happens to be nearby.


Not an expert in this kind of low-level stuff, but Bluetooth is mostly implemented in hardware. I don't imagine you could get a compact, lightweight headset that can last 10 hours on a charge if it needed to run even something as lightweight as a C library. And obviously, you can forget something like AirPods.


No, Bluetooth is pretty much entirely in software. I say that as someone who has worked on Bluetooth devices at CSR/Qualcomm. Embedded software quite routinely runs for hours on batteries. Indeed, some (entirely software-driven) devices last many years.


Are you basing this off of an assumption or do you have a source?


I think the author might not know that the dongles Logitech ships are not Bluetooth. They use Nordic's nRF24L01 chips, which run a much simpler protocol, very fast and robust for simple communications.

I use a Logitech MX Master, and while I do enjoy its Bluetooth capability (for laptops), I use a dongle with my desktop, which results in zero latency, no stuttering, and perfect synchronization.


I think the primary culprit is the power management on both the host and the client.

I have an MX Master and a K380. The MX Master has noticeable lag over Bluetooth, and both will have a noticeable delay, when first used after a period of inactivity, before springing to life. The K380 seems to have a key buffer to mitigate the delay.

If, however, I have my Bluetooth headphones powered on and connected to the host, keeping the BT radio active, most of the issues go away.


Then there's also Bluetooth Low Energy, which is Bluetooth only in name and mandated in the 5.0 version. Apparently most new products use BLE instead of BT, unless they use a well-known profile for, say, audio or game controller accessories.


I think this differs per keyboard. The dongle of my Logitech diNovo Edge is Bluetooth.


Right. Those old diNovos are a nightmare as well: not only did they ship with their own Bluetooth adapter, but they required you to install their driver, which would install its own Bluetooth stack (the Broadcom one; I think they used BlueSoleil before that) over the Microsoft one.

I still have nightmares about those things failing to pair constantly.


It was many years ago so this may be inaccurate, but I remember the dongle being the one stable solution. With the onboard BT of a Raspberry Pi on Ubuntu (or was it Raspbian?), there were recurring issues. With the dongle, it was perfectly stable, and IIRC it needed no drivers outside the mainline repos.


The 'Logi Options' software has been buggy for me on macOS Catalina and constantly stops remembering my custom MX Keys or MX Ergo/MX Master shortcuts until I relaunch the app. Anyone else dealing with this?


I simply stopped installing the terrible Logi software. I'm fed up with the crap, the resident CPU-eating daemons, the insistence on reporting what I do to Logitech or registering online for a cloud account — enough of this garbage.

Fortunately, the mouse works just as well without the crappy software, once you get it configured.


The reality is that if you poke yourself out of the Mac & laptop world, there's plenty of desktop computers without integrated Bluetooth support, and Bluetooth <-> USB adapters are pretty cheap, so vendors usually ship with one bundled to ensure they don't have to deal with tech support issues from people thinking they have Bluetooth support in their computer when they don't.


Somewhere on HN in a similar discussion, a Bluetooth stack engineer showed up and explained that the entire Bluetooth comms stack is a Rube Goldberg house of horrors, so this is really no surprise.

And then there are the bufferbloat people who finally tracked down the fact that in various common situations, the normal Wi-Fi protocol has a nasty interaction where it self-degrades to ridiculously slow speeds, and will take 2-5 MINUTES to recover.

We just don't have enough people doing serious whole-stack hardware+software analysis on our wireless stuff. At least outside of cellular telephony - probably because the money is more concentrated there.


I maintained a commercial Bluetooth stack for several years. My opinion is that the stack is actually not too bad (although Low Energy complicated it). The problem is that it has to be all things to all people.

There are discussions about codecs elsewhere on here: quality, latency, etc. There is a lowest common denominator so that everything works for everyone, but the information as to whether you're getting something else is hidden, both at purchase time and at run time.

Two things that would help:

Manufacturers actually putting supported codecs somewhere in the purchase information. Why do the tech specs on my Bose QuietComfort not have that information anywhere? If you Google, you can eventually find a support question and answer.

Exposing runtime negotiation results to the user. In Windows, when I'm connected to Wi-Fi it shows signal strength and other basic info via the icon in the taskbar. When I'm connected to Bluetooth, why is there not a display listing the profiles and codecs in use ("High Quality Audio mode via aptX Low Latency", "Headset mode", etc.)?

We actually need something similar for USB now as it has the same problem as Bluetooth - one transmission medium for a whole lot of different things all of which have to be able to negotiate to a common denominator. It happens a little bit with just the charging component (phone lets you know it's charging slowly if the charger can't negotiate up enough) - why isn't similar information provided somewhere for the rest of the USB features ("Connected to device X, can't enable feature Y as cable doesn't do Z").
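Just to sketch what such a runtime readout could contain, here is a hypothetical status structure in C++. None of this corresponds to an existing Windows or BlueZ API; the field names and example values are made up for illustration.

    // Hypothetical "what did we actually negotiate" readout -- not a real API.
    #include <cstdio>
    #include <string>

    struct BtLinkStatus {
        std::string device;    // e.g. "Bose QuietComfort 35"
        std::string profile;   // "A2DP (High Quality Audio)", "HFP (Headset)", ...
        std::string codec;     // "aptX Low Latency", "AAC", "SBC", ...
        int bitrate_kbps;      // negotiated bitrate
        int rssi_dbm;          // link quality
    };

    int main() {
        // Example values only; a real tool would query the local stack.
        BtLinkStatus s{"Bose QuietComfort 35", "A2DP", "aptX", 352, -54};
        std::printf("%s: %s via %s, %d kbps, %d dBm\n",
                    s.device.c_str(), s.profile.c_str(), s.codec.c_str(),
                    s.bitrate_kbps, s.rssi_dbm);
        return 0;
    }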


That's great for techies like us, but terrible for the 1000x more people who want stuff to "just work".

As an example, Windows partially exposes that with my current setup and it's a right pain, when some apps (e.g. Zoom) decide they preferentially want to use some other bluetooth codec than the one I actually chose.

Most people really just want one thing which says "headphones" and for the software/hardware to Just Work.


Maybe most people are wrong after all.

There's all this talk of ethics and of logical fallacies that's making the rounds, and then you have this failure to realize that non-techies are conditioned to see technology as some sort of black magic that you have to be a special kind of not-quite-person in order to be able to make sense of.

No matter what spin you put on it, life in our society is not simple. Any kind of tool tends to work better if you're using it with some degree of understanding how it works.

If only our industry would somehow magically stop enabling entitlement and complacency (and make it easier for users to "do their homework")... then maybe things would actually "just work" much more often.


You're missing the actual logical result of life in our society not being simple: people do not have enough time or available energy given the rest of the demands on their life. Civilization has evolved toward specialization, and the basic expectation of specialization is that the output of specialists is useful to individuals in other specialties. It is foolish to expect or require everyone to become an expert on bluetooth configuration options to have some headphones that don't work or a mouse without latency spikes. This isn't entitlement; this is efficiency and the way society is expected to function. It is a very long time since we have expected everyone in our society to be an expert at everything.


Good point - however, then at least have an option for specialists to get access to the info.

I also believe we haven't yet explored all options to make certain "specialist" topics more accessible to the general public. E.g., I'm continually surprised by how much the actual network traffic between apps/devices and their corresponding servers is hidden. I believe a simple visualisation of which apps/devices are talking to whom would do a lot to at least build some basic understanding of what's going on.

People seem to be very capable of learning concepts if they see personal relevance - e.g. teens know very well what the battery, wifi and signal strength indicators on their phones mean.


> e.g. teens know very well what the battery, wifi and signal strength indicators on their phones mean.

I will disagree with this wholeheartedly. Full bars are meaningless, and dBm is hidden (on Android at least). What we see is whatever the OEM chooses, with actual signal strength hidden away.


Also, RSSI is typically calculated from the successfully received frames. So you can see a comfortable -60 dBm signal... most of the time.


Yours is a common argument which has a lot of merit, especially to quell naive youthful idealism. However I can't help but wonder if it really is so easy to untangle cause and effect, especially considering evolutionary incentives are also subject to change.

Consider that we're not just already failing to build products that "just work"; we're also making it hard for users to make things work at all. There is obviously an incentive for that - Sinclair's law. However, imagine developers were actually incentivized to design beautiful, internally consistent systems and expose their inner workings in an accessible way. How do you go from this proposition to the conclusion that everyone would have to become an expert in everything to do anything?

If everyone had access to the information needed to fix their problem using basic reasoning, but wanted to do other things, they could be able to find a nearby person who knows a little more and can help - as opposed to wasting time posting in support forums, waiting for customer support's non-answers, having to learn Google-fu to find if someone had the same issue, and eventually throwing the goddamn thing in the trash.

As an aside, I do have an axe to grind here; I did buy a pair of Bluetooth headphones the other day, and I feel like chucking them in the bin even though they just work, just because I'm sick of being locked into yet another half-arsed wireless technology. I'll probably just give them to someone less demanding, because I don't want some landfill dwellers on the other side of the world inhaling my headphones. I've also reached the point where, if anyone asks me for help with computers, I recommend trying pen and paper instead.

Anyway, the fundamental distinction I'm trying to make is ternary:

- having to learn things that make sense (competency and proficiency are enjoyable), vs.

- having to learn things that pretend to make sense but don't (being good at those also feels good but I suspect is devastating to mental health), vs.

- outsourcing all of your learning and going on with your life (until the Morlocks eat you)

History in general feels very much like one of those over-engineered legacy systems where every change is slow and painful, and the current maintainers can only guess why things have ended up being in a certain way, and hope trying to fix them doesn't suddenly make them worse. If you only have pure human universals to deal with in your life, I envy you. Otherwise, you might need to acknowledge that a lot of the "demands of life" are simply arbitrary constructs that beg to be challenged, renegotiated, optimized.

Maybe everyone becoming an expert in Bluetooth configuration or whatever is not pleasant or desirable, but in my book it beats destroying our economy, species or ecosystem by a thousand small cuts.

TL;DR: "Specialization is for insects," anyone?


But as Alfred North Whitehead wrote in 1911,

"It is a profoundly erroneous truism, repeated by all copy books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them."


It's an absolute explosion of complexity. I'd have to talk for nearly an hour to explain the Bluetooth modes, and longer if the user didn't understand radio interference. It's too much. It's like having to be able to tune a carburetor to run a car. It's a skill very few people have, and needing less of it is actually good.


Explosions of complexity tend to happen when complexity is poorly managed in the first place.

If you need the person to understand Less need for specialized skills/info is good. Less availability of that info when needed, I just can't see how that can be good.

An hour of coherent explanation by a friend is worth more than an hour of pointless frustration at a company, no?

Correct me if I'm wrong (oh no, needing specialized info to make a general argument!) but modern cars don't have carburetors anymore, because it turned out to be an evolutionary dead end of sorts?

Maybe the same thing is bound to happen to Bluetooth anyway; and I believe that the better its workings are exposed to those who want to know them, the sooner people would be able to replace it with something that works better, thus generating less waste, of both resources and time.


Idk where the "If you need the person to understand" part came from


There are better and worse ways to build the UI around this information, but to say it should be completely hidden from / impossible for the end user to find is just outright hostile to the user.

Do you think people would be happy if the wifi status icon just became a boolean "on/off"? Of course not. They are able to make differential decisions in light of the (limited) information presented.

Empower users with information!


> Most people really just want one thing which says "headphones" and for the software/hardware to Just Work.

Sadly, Apple removed that option from phones, and other companies followed suit.


There's a difference between making diagnostics and status info readily available and actually letting the user change the inner details. I think the point was that the OS could expose the result of negotiation so that it was easier to inspect later, either by a technical user or a support person helping a less technical user.

I like the idea and hope some MS engineers are taking note.


> Exposing runtime negotiation results to the user.

General peeve: engineers hide this stuff from the user because they think users won't know what to do with it, it'll just confuse them, or it's ugly. My experience is that this is completely wrong.

Also I think a problem with USB and Bluetooth is mixing up the device profiles with the communication protocol. That's always a bad idea yet people keep doing it. Half these problems would go away if USB and Bluetooth were like Wifi/Ethernet. Just a packet level protocol.

Also, making Bluetooth a low-power/low-data-rate frequency hopper was a terrible mistake. The people responsible should have spent the rest of their lives shoveling hot asphalt.


> Half these problems would go away if USB and Bluetooth were like Wifi/Ethernet. Just a packet level protocol.

Every few years I have this thought:

If, instead of <X> on <Y>, we had just put more effort into Ethernet (and just Ethernet), we'd probably have reached <Z> by now.

Fiber Channel. USB. Thunderbolt. HDMI/DVI. SCSI/SATA. Bluetooth. BLE. Make wifi be ethernet over radio not the monster it is today.

Of the above, only Ethernet has the good grace to be driven by (relatively) simple software. It's also the only one that isn't getting obsoleted by the next complex thing coming along.


One pain-in-the-ass thing I had to do was write a driver to support talking to MODBUS devices. MODBUS is a 40-year-old register-based protocol. It's not going away. Almost all the other mixed protocols of its vintage are long, long, long dead.


Are you talking about this?

https://www.bufferbloat.net/projects/bloat/wiki/What_can_I_d...

P.S. SFQ is enabled as part of QoS (on my ASUS 86U). There are enough people working on Wi-Fi; you just have to invest the money/time to buy/configure proper AP hardware.

In cellular, the AP is managed by a team of engineers; in the consumer Wi-Fi space, it is a race to the bottom.



Most off-the-shelf Bluetooth stacks are pretty terrible... the only reason ours was decent was because we built it from scratch before Bluetooth was so predominant. Ours had some issues because we were behind the spec (the guys at the top only wanted features that supported the spec; no cutting corners, no pretending the features worked, it had to work correctly).

Source: worked in radio verification at a mobile company that developed its own Bluetooth stack. RIP BlackBerry.


Does anyone have a link for this Rube Goldberg house of horrors? Sounds terribly interesting.


What's with the hate against wires? Wires always work. My wireless printer never worked. Well, it prints once, then I have to power-cycle it to print again. Thanks, Samsung.

The minimalism point doesn't really cut it: you still need a wire to charge batteries inside things, so it's one more item. And batteries go bad in a matter of years, so you need to buy a soldering iron and a LiPo cell from shady sources unless you're into replacing >$100 devices every three years or so. Or buy external AAA batteries and a wall charger like 90s toy cars.

I don't mind the aesthetics of wires. I love the mess behind my desk and the thought that electrons are being pushed at 40 Gb/s to form an image. Mix and match different colors and braid types for a free steampunk look.


Personally, I'm also a big fan of wires. My partner, however, fucking hates them.

I can try to run down her thought process for you:

1) Wires look untidy

This doesn't necessarily have to be true, but making wires neat is an art form; a wire that runs from the TV behind the sofa, for example, is something that will never easily be hidden.

2) Wires are hard to clean/collect dust

Wires that aren't in those fancy plastic routing channels seem to accumulate dust and are generally hard to clean around. Loose wires that spiral are the worst, but even single cables running along the edges of the room seem to be harder to clean.

3) Wires don't "add any benefit".

For her, if it works it works, and when it doesn't (and I explain that it's because the signal is poor or there is interference) her reaction is: "You're smart, you work with computers, make it work." So I do ugly things like running Ethernet over powerline, which has its own cons.

--

There is a generalised ignorance in the non-tech populace about how wireless communication systems work. Nobody thinks about a 'shared segment of space'; they think in terms of themselves, their router, and principally "their connection at the time" to the Wi-Fi router.


I fucking hate wires too, so I'd add the following to your partner's complaints:

4) Wires make the physical environment much more shaky.

I have a cat. Like all cats, he occasionally gets rambunctious. When he goes zooming around the desk, the more things that are connected with wires, the more likely it is that something gets yanked by something else into a catastrophic (as it were) orgy of destruction.

Of course, one could just unplug everything that isn't in use, in which case one runs into:

5) Wires create complicated organizational challenges in practical use

For example, many of us have more devices than one has things to plug them into (whether that is outlets to plug into the wall or USB ports on the computer or whatever); changing devices means rooting through a pile of cords to figure out which one happens to go into which thing that needs plugging in. Wires also impose location limitations on the things they're attached to, as extra-extra long wires sometimes degrade data quality too, or are spec-violative because of some kind of safety consideration[1], or impose other ancillary safety considerations such as the risk of tripping, of putting furniture on some power wire and fraying it like crazy, further cat dangers, etc. etc.

Maybe in an industrial environment, where the entire physical plant can be adapted to the needs of the equipment, it makes sense to wire everything. But in a home, where people have needs independent of servicing their devices and don't have the money to, for example, hire someone to cut holes into the wall to run all the cables through there, wires can seriously degrade a bunch of realistic practical considerations.

Source: myself, a person who has recently moved to a building that provides free cable, and hence has cable TV for the first time in his life. I have now discovered that in order to make a basic cable TV + internet setup work, one needs 3 wires into the TV, 4 wires into the modem and a splitter on the cable jack, 3 wires into the cable box (which apparently is also a TV), and, since modern TVs have shit speakers, a sound bar with 2 or 3 wires into that (I forget which and can't tell in the wire forest). The cable wire is run entirely across the living room, since apparently the only place to connect to the cable company is on the opposite side of the living room from where any rational person would put a TV, all resulting in this horror[2], all of which is powered by a single duplex outlet with multiple surge protectors in it, which, at least if you believe the Yale fire marshal, is a no-no[3], but other random places on the internet disagree and, anyway, what choice do we have in the Land of Endless Wires?

[1] For example, USB-C PD apparently has a limitation of 6 feet in the spec? https://superuser.com/questions/961176/usb-type-c-power-deli...

[2] https://imgur.com/0UWlCYW

[3] https://www.ucop.edu/risk-services/_files/bsas/safetymeeting...


Wires can also be neatly snipped by cats that take a fancy to doing so. Ironically, the first thing he did this to was a Wi-Fi antenna!


OMG. Don't give my Leonidas ideas.


> The minimalism point doesn't really cut it

You're completely correct, but I'd go even further: if you actually understood how wired and wireless work, you'd realize that the wired solution was truly much more minimal: less complexity, less cost, more reliability.

Wired is only more minimalist if you're looking at the wires and not thinking about the technology.


My only argument against wires is the inevitable wear and tear. I don't know if this is just me or my poor choices regarding equipment, but I regularly have to replace USB, RJ-45, and 1/4-inch jack (guitar) cables because they simply die on me.

The same goes for sockets. Recently I bought a certain laptop because, among other things, it can also be charged via USB-C (albeit slowly), giving me a fallback socket should the primary fail; this happened to my previous device.


What are you doing to your wires?

Aside from Apple brand cables, which lack strain relief, I don’t think I’ve ever killed a single USB or Ethernet cable. If it weren’t for the transition from USB 2 to 3, I’d probably keep the same cables for my entire life at this rate.


I ask myself the same question every day.

Anyway, the USB-C cable I got with my phone lasted two years, but I've used it everywhere; my 1/4-inch cables broke because I made the mistake of buying "by Klotz" cables, and RJ-45 cables are mostly Chinese crap, so no surprise there.


Maybe you're pulling by the cable and not the connector? Some USB-C cables have very small connectors, making this an easy mistake to make.

The more I think about it, I'd probably destroy Lightning cables regularly if it weren't for the fact that I only use wireless charging. You have to unplug that cable a lot, which is pretty hard on the cable and connector.


I twisted my headphone wires too much. I had to take the headphones apart, botched a soldering job, and then just put the two wire ends in a ferrule as a last resort. It works, but I wish Bluetooth were actually worth using instead of having to hack a solution together. Back when Apple removed the jack I thought wireless wasn't ready yet, and my opinion hasn't changed in 2020.


In my experience Bluetooth appears to draw as much power as the headphone driver, so with wireless headphones one is actually using three times the energy, which is precious in a battery-operated mobile device.

I go wireless only when I'm doing manual work like chores. In every other instance it's just not worth it.


Similarly, I prefer wireless charging on my phone and other portable devices. No worries about wear and tear. For about a year I used a tablet with a broken micro-USB port by charging it with a wireless charger.


The efficiency of wireless charging is just 62%: https://youtu.be/zLrSM4ruNw4


Replace one lightbulb with a more efficient one and you'll compensate for that.

If you still feel guilty about the extra 25 cents of electricity you'll waste per year, go buy a $5 carbon offset and donate $5 to an environmental charity.
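For anyone curious where a figure like that comes from, here is a rough back-of-the-envelope calculation in C++. The battery size, wired-charger efficiency and electricity price are assumptions, not measurements.

    // Rough numbers behind the "cents per year" claim (all inputs assumed).
    #include <cstdio>

    int main() {
        const double wh_per_day   = 12.0;  // one full charge of a ~12 Wh phone battery
        const double eff_wireless = 0.62;  // from the video above
        const double eff_wired    = 0.90;  // assumed for a decent wired charger
        const double usd_per_kwh  = 0.13;  // assumed electricity price

        double extra_wh_per_day = wh_per_day / eff_wireless - wh_per_day / eff_wired;
        double extra_kwh_per_yr = extra_wh_per_day * 365.0 / 1000.0;

        std::printf("extra energy: %.1f kWh/yr, about $%.2f/yr\n",
                    extra_kwh_per_yr, extra_kwh_per_yr * usd_per_kwh);
        // ~2.2 kWh/yr, roughly 30 cents -- the same order of magnitude as above.
        return 0;
    }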


How much energy is needed to manufacture a micro USB charging cable? I've had to throw more than half a dozen into the trash because they broke or got too loose.


Wires don't always work.

If you think wires always work, you've never dealt with the vagaries of SCSI or Fibre Channel or even 10-gigabit Ethernet. Hell, Benson Leung became net-famous for reviewing all the problems with USB cables (https://kirkville.com/google-employee-reviews-usb-c-cables-o...), let alone the insanity between USB devices (and this was before the whole USB 3, USB 3.1, USB 3.1 Gen 1, USB 3.1 Gen 2, and USB 3.2 fiasco).


> "My wireless printer never worked. Well it prints once, then I have to power cycle it to print again. Thanks Samsung."

My wireless Samsung printer used to work very well, but lately various devices keep claiming it's not connected. Turning it on manually doesn't fix it. Since I'm near the printer anyway, it's usually easiest to just hook up by USB. (This is hardly the only problem with it; sometimes it's just in a "paper jam" mood and I need to "fix" a non-existent paper jam after every single printed page. The fact that support is now handled by HP doesn't exactly endear me to the machine either.)


I decided a long time ago that when I buy a house, I’m running Ethernet cables in the walls.

Wi-Fi is fine if you're gonna meander around, but I've had enough nonsense with the signal cutting out while I'm sitting still. I've also had enough of my speeds degrading 90% from one end of my tiny apartment to the other. Give me my copper cables back, please.


Of course it's nonsense. Lots of lag, jitter, and interference. People should use cable whenever possible. But Apple marketed wireless as being a 'revolution' and suggested a cable-less workspace, and people liked it.


I like wires, but I am prone to tripping. Even so, I could probably get used to wires, but I have very small kids who constantly pull out, chew on, run across, etc. any wires they find exposed.


The Bluetooth headset latency kills me, especially now that so many devices are losing the headphone jack. As a hobbyist musician it's unusable, and it's the same issue for games.

It’s just worse in every respect and the headsets cost three times as much for the equivalent sound quality.


I agree for music creation or gaming, but in general wires are definitely annoying and limiting. I could never give up my Bluetooth headphones now for everyday use - wireless plus noise cancellation makes such a big difference when commuting or in the office.

The weird thing is, low-latency wireless audio transmission is quite common: think of the systems used for in-ear monitoring on stage, or wireless guitar cables, which just broadcast over RF frequencies. I don't understand why no one sells headphones using technologies like this, even if it meant a dongle that needs to be plugged into your computer as well.

Interference would probably be an issue if a full train car of people were trying to use this or something, so I get why Bluetooth is a better, more standardized default. But I would love to have the option of an alternative. I would definitely buy low latency wireless headphones to use while playing music.


There are wireless headphones using other technologies than Bluetooth. I'm currently using a Steelseries Arctis 7 and it has no perceivable latency (I play rhythm games with it).

And it indeed comes with a USB "dongle"/base station with a cable, about the size of a watch, so it's not really usable on a train, etc. (I use wired headphones when working away from my desk).


The best part is that the headset can be connected with a wire to both Xbox and PS4 controllers.

Also, the custom dongle has a 3.5 mm output plug for speakers, which will automatically turn off when you switch on your headset.

If only Steelseries would make noise-cancelling headphones...


> low latency wireless audio transmission is quite common.

The new Qualcomm aptX Adaptive standard supports lowish latency: ~80 ms.

List of officially supported devices: https://www.aptx.com/product-listing?aptx_type=336

It might also work in general with Android 10 devices, but this needs verifying.


The GGP says he is a musician; 80 ms is going to be much too high.

More niche, but close to my heart: it also basically kills rhythm video games. Those can mostly compensate for display latency, but compensating for audio latency is more troublesome, particularly if the exact amount of latency isn't constant.


To follow up, in most rhythm games, the "perfect" timing window is around +/-33ms. In DDR, the "marvelous" timing is around +/-16ms. 80ms is almost a combo breaker.

Top level players can stay in the +/-33ms window for thousands of notes without making a single mistake, and "full marvelous combos" where you stay in the +/-16ms window for an entire song are not unheard of.

That's the kind of timing we are talking about.


...it's interesting to note, though, that because DDR doesn't provide auditory feedback when you hit a note, it should actually be able to compensate for audio latency quite well. The big caveat being that you have to know what the latency is and that it must stay consistent, neither of which applies to Bluetooth headphones.
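A minimal sketch of that compensation idea in C++ (all numbers assumed): if the output latency is known and constant, the game can judge inputs against the chart time shifted by that latency instead of the raw chart time.

    // Offset compensation sketch for a rhythm game (values assumed).
    #include <cmath>
    #include <cstdio>

    int main() {
        const double audio_latency_ms = 80.0;   // known, constant output delay
        const double chart_time_ms    = 1000.0; // when the note nominally lands
        const double input_time_ms    = 1078.0; // when the player actually pressed

        // The player keys off what they hear, so judge against the shifted time.
        double error_ms = input_time_ms - (chart_time_ms + audio_latency_ms);

        const char* judgment = std::fabs(error_ms) <= 16.0 ? "marvelous"
                             : std::fabs(error_ms) <= 33.0 ? "perfect"
                             : "miss";
        std::printf("timing error: %+.1f ms -> %s\n", error_ms, judgment);
        return 0;
    }

The +/-16 ms and +/-33 ms windows are the ones quoted upthread; with an unknown or drifting Bluetooth delay, no constant offset like this can be correct.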


80 ms is a round trip across the US; that may be low latency for Bluetooth, but it's not low latency for the world.


As a musician I'd call anything ≤10 ms low, while 20 ms would be okay and usable. Something like 80 ms would just be distracting for anything rhythmic (okay for pad stuff or other slow, swelling sounds).


Assuming my audio interface is telling me the truth about roundtrip latency, for me anything above 8ms starts to be detectable when playing guitar. Between 8-15ms or so it's subtle and you can kind of trick your brain into ignoring it but it's definitely still there if you pay attention.

And from what I've seen, in the best-case scenario a USB audio interface is going to have a round-trip latency of 4-5 ms (Thunderbolt can be much better; there are some that claim 1 or 1.5 ms).

So IMO that means ideally wireless headphones need to stay < 3-5ms to be an unnoticeable replacement for wired ones.


LHDC LL has 30 ms latency. It's included in Android 10.

Maybe this would be OK?


Latency is only an issue if it is not accounted for. I am not sure if this is the case for video on Android.


I think the first-gen AirPods were 200 ms, and the current AirPods Pro have it down to ~150 ms, which is already the outlier in the industry, as most are still 250 ms+.

Compare that to wired headphones at around 5 ms of latency.

At least Apple continues the R&D to lower latency. Hopefully someday it will reach sub-50 ms.


I don't think wired headphones really have latency of their own. A single audio sample takes less than a microsecond. The latency will all come from upstream buffering, which will depend on the application and the OS.


Yup, not sure where the OP got the 5 ms number; perhaps it's about the soundcard buffering, but then it's probably a sort of high-end setup using ASIO drivers and a rather small buffer size (say 192 samples at 44.1 kHz), because I doubt your everyday built-in soundcard achieves that. But it cannot be about the speed of the analog signal from the DAC until it reaches your speaker: too lazy to look it up, but worst case IIRC is still half the speed of light, so not even near the ms range.
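To check the buffering math, here is a quick C++ calculation (buffer sizes assumed; real driver chains often add more than one buffer of latency): one buffer's worth of one-way latency is simply buffer_samples / sample_rate.

    // Buffer latency = samples / sample rate (values assumed, not measured).
    #include <cstdio>

    int main() {
        const double sample_rate_hz = 44100.0;
        const int buffer_sizes[] = {64, 128, 192, 256, 512};

        for (int n : buffer_sizes) {
            double ms = 1000.0 * n / sample_rate_hz;
            std::printf("%4d samples @ 44.1 kHz -> %.2f ms per buffer\n", n, ms);
        }
        // 192 samples is ~4.35 ms, so a buffer or two in the chain lands
        // right around the "~5 ms wired" figure quoted upthread.
        return 0;
    }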


5 milliseconds is about what you'll measure for latency if you measure a wired headset, but yes, you're really measuring your sound card.


c ≈ 1ns/ft


Just to be pedantic, c is the speed of a photon in a vacuum. Electrons are much heavier and are traveling in a medium, copper.

[0] https://en.wikipedia.org/wiki/Drift_velocity


The signal speed has nothing to do with the electrons’ drift velocity. It is a bit slower than c, but negligibly so.


Tyvm, this is super handy! More precisely, ~1.01670 ns/ft (~0.983571 ft/ns).


1ns = 300mm (or 30cm)

1us = 300m

1ms = 300km
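As a small follow-on, here is what that works out to for an actual cable, in C++; the velocity factor is an assumption (typical cables propagate a signal at roughly 0.6-0.9 c depending on the dielectric).

    // Propagation delay over a cable (velocity factor assumed).
    #include <cstdio>

    int main() {
        const double c_m_per_ns      = 0.299792458; // metres per nanosecond in vacuum
        const double velocity_factor = 0.7;         // assumed for a typical cable
        const double cable_m         = 3.0;         // a long headphone cable

        double delay_ns = cable_m / (c_m_per_ns * velocity_factor);
        std::printf("%.1f m of cable ~ %.1f ns of delay\n", cable_m, delay_ns);
        // ~14 ns: roughly five orders of magnitude below even a single
        // millisecond of the Bluetooth latencies discussed above.
        return 0;
    }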


Does the phone adjust for this when playing videos? Because I find even 50ms desync between audio and video to be very jarring.


Yes, phones delay video playback so it matches the reported latency of the audio device. This works via networked audio protocols as well, at least on Apple devices in my experience.
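A toy sketch of that mechanism in C++ (numbers assumed): the player shifts each video frame's presentation time by the audio device's reported output latency so picture and sound line up again.

    // A/V sync sketch: delay video presentation by the reported audio latency.
    #include <cstdio>

    int main() {
        const double audio_latency_ms = 150.0;         // e.g. reported by a BT headset
        const double frame_ms         = 1000.0 / 30.0; // 30 fps video

        for (int frame = 0; frame < 3; ++frame) {
            double original_pts_ms = frame * frame_ms;
            double delayed_pts_ms  = original_pts_ms + audio_latency_ms;
            std::printf("frame %d: present at %.1f ms instead of %.1f ms\n",
                        frame, delayed_pts_ms, original_pts_ms);
        }
        return 0;
    }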


One frame at 30 fps lasts 33.33 ms. I work as a freelance re-recording mixer and can assure you that a shift of one frame can definitely be noticeable.

However there are some people who are more sensitive to this than others.

50 ms is more than a full frame at 25 fps, which is already quite a lot. Anything beyond that is just a crime against anybody who worked hard to sync sound and image.


This can easily be seen in action when playing audio over AirPlay, where the latency is ~2 seconds. When you press Play, the video waits for a moment to compensate for the audio delay and only starts after the two-second delay, in sync with the audio.


The popular type is from Qualcomm, called aptX Low Latency (not the same as the 'lossless' aptX HD versions), and you need it on both sides, transmit and receive. I have never seen this supported natively, so I got a pair of headphone-jack-to-BT dongle adapters. Video works fine on hotel TVs, which was my normal usage.

A normal old-school ISM-band headphone with a transmitter base is better for TV, but the dongles travel better.


That is a very good point. I have no idea, as I don't use it to watch video. I would not be surprised if that were the case.


No need to wait for Apple. Low latency was a thing many years ago, where we're talking about ~40 ms over Bluetooth.


40ms is not low latency. It's good for Bluetooth, but it's not low. Low latency, as usable for rhythm games or music, is more like 10ms.


My TV already has higher latency than that and I manage on Rock Band just fine - anything that depends on A/V synchronization is generally aware of latency being a problem and has knobs for the users to adjust for it.

That being said, waiting multiple frames for video playback to start while audio buffers fill and transmit is a shitty experience - so it’s always good to reduce it.


What devices support that?


Not sure what the best way to look for it is, but searching for devices that support aptx low latency might be a decent indication.

https://www.aptx.com/product-listing?product_category=7&aptx...


I completely agree with you. The latency in wireless headphones is bad. It is clearly noticeable in games and sometimes even in general music playing. I can't imagine people playing serious games on mobile devices without a headphone jack.


I simply can't imagine people playing serious games on mobile devices, period.


I got tired of having ads constantly shoved in my face for games I already paid for that then go free. That's when I realised games on mobile were never going to work for me.


Handheld consoles have lots of serious games. You just need to be on a platform that values serious gaming, not 5 min play sessions.


I simply can't imagine serious games.


I feel ya.

People with a desktop computer are a minority, though.


The only advantage is that when I'm running there isn't an annoying cable to fuck with. There is no other benefit that I can see. But I started running more to justify the purchase. Ahahaa.


Same here, but now when I run past other people using wireless devices the sound breaks up. Still probably better than a cable swinging around.


Same. I only got Bluetooth earbuds for going to the gym. I specifically got ones that are sweat- and waterproof because otherwise I wouldn't use them. Cabled earbuds are far superior in every other way.


I'm waiting for some AirPods to be delivered today, so this is painful to read. I don't mind a few ms of latency, but are we talking noticeably bad latency? The only experience I have is with BT pairing with my car for music. I don't even use voice in the car.


For playing music or even calls you won’t notice any difference.

The problem is trying to use an app like GarageBand where you need an instant response for instrument control. In fact, the first time you open GarageBand and it detects you're using Bluetooth, it'll give you that warning.


> For playing music

What do you mean by "playing"? I play the digital piano with wired headphones, and with Bluetooth sound it is absolutely unplayable due to the latency.


Playing as in pressing play and listening. He was separately talking about interactive playing with the GarageBand example.


Thanks. Just arrived and saw that. It's not that bad. I'm not using them for GarageBand though. Phew :)


Watching videos and listening to music is fine; devices usually delay video playback to match the latency, so you don't notice any difference. The latency problem really only matters when you need instant feedback for your actions: playing games, editing audio, etc. Even then you get used to it.


In my experience it varies a lot. I don't have airpods but I have some other wireless earbuds without any noticeable latency. Later I bought a different model from the same manufacturer (Anker) and they are useless for video of any kind. I also have a pair of Sony wireless headphones that have no latency I can discern.


I only notice it on my AirPods when I pause what I'm listening to. I have "double tap right pod" set as toggle play/pause. "Tap tap" and wait; often I'll "tap tap" again just as it pauses, only to cause it to play again. I have to trust that my initial tap tap registered.


I’m definitely 9/10 for wires rather than against.

But on the minus side, I have a chirp in my headset (obviously some interference from the motherboard) when connecting with a wire that isn’t there over Bluetooth.

Sometimes wires fail too.


Many people don't realize it, but the Wi-Fi chipset is very important. Get the cheap/free stuff from your ISP, and it will give you trouble (especially under load or with multiple clients). Buy expensive gear with Broadcom/Qualcomm hardware in it, and you will be better off. ASUS with the rt-merlin firmware is a solid choice.

Also, router placement is very important. Most people have their Wi-Fi router in the basement, where their cable/phone line comes in. Spending some time running a physical wire to the center of the living space and setting the router up on a shelf or a table will go a long way toward better propagation. I have mine set up in the center of my bungalow, and I have 5 GHz Wi-Fi reaching from my driveway to the backyard, no issues, and never any hiccups.

Also, if you pick a high-end router (like the ASUS 86U with Merlin), you get some proper Wi-Fi diagnostic tools built in, like the number of packet retransmits, bad FCS, etc., which will help you diagnose your issues.

Multiple people on the same band is not as big an issue as many claim it is. 5 GHz (no B/G legacy frames) and 80 MHz bandwidth ensure that most traffic is very short in actual airtime. Not perfect, but it is orders of magnitude better than 2.4 GHz, despite there really being only two 80 MHz channels available. There is also channel 165, which is 20 MHz only but rarely occupied.
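A back-of-envelope airtime comparison in C++, supporting the point above. The PHY rates are assumptions (54 Mbps for legacy 2.4 GHz 802.11g, 433/866 Mbps for one or two streams on an 80 MHz 802.11ac channel), and this ignores preambles, ACKs and contention, so real airtime is higher.

    // Airtime of a full-size frame at a few PHY rates (rates assumed;
    // preamble, ACKs and contention are ignored).
    #include <cstdio>

    int main() {
        const double frame_bits = 1500.0 * 8;             // a full-size data frame
        const double rates_mbps[] = {54.0, 433.0, 866.0}; // 2.4 GHz g vs 5 GHz 80 MHz

        for (double r : rates_mbps) {
            double airtime_us = frame_bits / r;           // Mbps == bits per microsecond
            std::printf("1500 B at %6.1f Mbps -> %6.1f us of airtime\n", r, airtime_us);
        }
        // Roughly 222 us vs 28 us vs 14 us: the 5 GHz rates keep each
        // transmission ~8-16x shorter, so sharing the channel hurts less.
        return 0;
    }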


FWIW, Xfinity provided me an "xFi" modem/router combo that performed great. I replaced it with a standalone modem + a Unifi Dream Machine from Ubiquiti, and my wifi speeds are significantly slower now. Still worth it to regain control of my network, though.


That makes no sense. That device is almost bleeding edge and should be just as fast as, if not faster than, your router. The only thing I can think of is that the Ethernet port out of the modem isn't gigabit, and that's why it's slower.


I get full gigabit when wired in; it's definitely the Wi-Fi that is the bottleneck.

With the Dream Machine's built-in AP I get 400-500 Mbps down, compared to the xFi, which got nearly full gigabit speeds. This is with direct line of sight to the access point, maybe 8 feet away.


Similar situation here. The AT&T-provided router for gigabit fiber performs significantly better than my new Ubiquiti "frisbee" access point that's mounted on the ceiling.

At a previous house, I was equally impressed with the WiFi speed and propagation with the stock AT&T router (a different manufacturer), which performed at nearly the theoretical max.

Perhaps ISPs are learning to provide higher end equipment? Of course, gigabit fiber already necessitates better than average equipment, lest you expect complaints from customers.


> ASUS with rt-merlin firmware is a solid choice.

Just so long as you don't mind giving Trend Micro access to all your information.


rt-merlin explicitly allows you to turn the Trend Micro stuff off. Trend Micro is only useful for QoS, and if you have a fat Internet pipe, you probably don't need it.

ASUS-Merlin has DNS-over-TLS support, which, combined with NextDNS, will help you block Trend Micro if you really don't trust it.


The easiest solution to most Wi-Fi problems is usually having multiple access points.


I added a second one, and things got substantially worse. Devices would stick to the faraway access point because they could still just see it.

I tried things like "roaming assistant", to forcibly disconnect clients below a certain signal strength, but that interrupted ongoing WiFi phone calls and such.

It's possible that 802.11k and r help, but at the time I investigated, many clients didn't handle that properly.

Instead, I mounted one access point on the ceiling in the middle of my apartment, and WiFi is much better.


I agree, wireless is overrated in “desktop” situations.

One solution that I hope more monitor manufacturers will follow is the recent Dell "hub" monitors with built-in Ethernet (RJ-45), USB-C, downstream power, USB-A, etc.

https://www.dell.com/lt/business/p/dell-u2421he-monitor/pd

Only one cable to connect.


I'd take a simple monitor and an external hub, in place of a monitor+hub integration, any day of the week. We put two things one inside the other and call it progress, but that's less maintainable, sometimes less portable, more expensive, and, most importantly, less modular.


More wires, more mess, more ugly; not everyone has the same priorities.


Attach the hub to your monitor's VESA mount points and you gain a lot of long-term flexibility at the cost of a single short wire that'll be hidden behind your monitor. Otherwise functionally identical to an integrated hub/monitor, but you can upgrade or replace the individual components on their own schedule.

A good monitor can last through generations of computer hardware, it doesn't make sense to combine directly with other components that have much shorter lifespans.


Isn't it the same number of wires, since you still need to run a usb cable from the computer to the monitor? So then really all you get is not needing a separate box for the hub. That's admittedly nice -- I have a Dell monitor with usb ports and I use it, mostly because I'm not going to buy a standalone hub when I don't need it -- but if the monitor had a way to mount the external hub on the back, I think that would be the best of both worlds.


One cable rather than two. Some monitors have integrated USB hubs, but you need to connect a separate USB-A to USB-B cable, in addition to HDMI/DVI/VGA. Newer, high-end monitors will transmit video and USB over a single USB-C cable, as well as power in the other direction (to charge a laptop).


Interesting. It basically turns your monitor into a docking station. I could imagine adding an eGPU (if that doesn't already exist), too.

Personally I prefer modular components that I can swap out and upgrade separately, but I can see the appeal.


USB and video can share the same cable.


Apple has been doing this since the advent of Thunderbolt, and I have to believe it was one of the reasons they wanted it and then pushed toward USB-C.

With MiniDP and MagSafe it was nice but clunky; with USB-C and USB PD it’s great. I know some people miss the old clicky docks, but I agree with you that the monitor + single cable is a really good solution.


Honestly for a laptop I'd prefer the following setup:

An E-GPU enclosure, which is also the power supply, has an ethernet port, and provides some USB ports for any devices you always leave at the desk.

You connect it to the laptop with a single thunderbolt power-delivery type-c cable.

The monitor(s) are connected to the enclosure with type-c displayport with power delivery.

Thus you have only a max of two cables connected to the wall: power cable for E-GPU dock and ethernet. This is fairly similar to that monitor concept except that the monitors docking station is broken out as a separate component.


I wish something like HDBaseT would catch on.


In an ideal world, with plenty of funding, I've always wanted to take that a step further and implement a remote virtualized workstation. With just 2.5Gbps from NBase-T, 802.3bt for power, and VESA Display Stream Compression, you could hypothetically power a monitor from a single cat5e cable as well as output full 1080p 60Hz video ("visually lossless"), supply power and data for USB, and maybe even a passthrough 802.3af ethernet port for an IP phone.

You could take that even further by adding a custom SPICE-like display protocol that only sends rectangular regions of changed pixels plus translation commands to shift a region of pixels around (scrolling, moving a window, etc.). Given it's all virtualized, instead of traditional IP phones you could go with dumb USB desk phones and route each USB device to a single PBX, building something akin to an old-school PBX where the phones are more or less just dumb terminals. Rather than dealing with the headaches of setting up BLFs, dial plans on the phone vs. the PBX, call forwarding and DND on the PBX and phones, VM access, hot-desking setups, etc., it's all just a bunch of specialized HIDs hooked up to a VM. Need HA? Move the USB devices over to a separate VM on failover.

It'd have all the benefits of thin clients with significantly fewer drawbacks. Need dual monitors? No need to replace your bargain-basement thin client with one that supports dual monitors; just pull up another monitor, hook it up to the same seat on the servers, and done. You could sell a deluxe version with dual inputs so that each display is connected to two independent switches, so if one bites the dust it doesn't take down 48 employees at a time.
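
If you run the numbers, the 1080p60 part works out with room to spare. Rough back-of-the-envelope (the 3:1 ratio for DSC is my assumption, not a figure from the spec):

    # Back-of-the-envelope for 1080p60 over a 2.5GBASE-T link.
    # The 3:1 ratio assumed for VESA Display Stream Compression is a guess.
    width, height, fps, bits_per_pixel = 1920, 1080, 60, 24
    raw_bps = width * height * fps * bits_per_pixel   # ~2.99 Gbit/s uncompressed
    compressed_bps = raw_bps / 3                      # ~1.0 Gbit/s with assumed 3:1 DSC
    link_bps = 2.5e9                                  # NBase-T (2.5GBASE-T)
    print(f"raw {raw_bps / 1e9:.2f} Gbit/s, compressed {compressed_bps / 1e9:.2f} Gbit/s")
    print(f"headroom left for USB, audio, etc.: {(link_bps - compressed_bps) / 1e9:.2f} Gbit/s")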


For anyone else wondering https://en.wikipedia.org/wiki/HDBaseT

> HDBaseT, promoted and advanced by the HDBaseT Alliance, is a consumer electronic (CE) and commercial connectivity standard for transmission of uncompressed high-definition video (HD), audio, power, home networking, Ethernet, USB, and some control signals, over a common category cable (Cat5e or above) using the same 8P8C modular connectors used by Ethernet.


I'm not convinced for yet another standard. Why not encapsulate in Ethernet?


From what I have been able to gather, it's 10G Ethernet optimized for unshielded cat5e, with a slightly higher line rate.


Sure it sounds neat, but it also sounds a lot like that xkcd about standards.

Also, the 8P8C connector is just godawful. I mean, it's fine for the first 3 days until the clip breaks off.

https://xkcd.com/927/


8P8C has the huge advantage of being pretty easy to terminate with some basic, cheap hand tools. I’d hate to try that with a USB-C connector.


That's true, with the caveat that cheap crimp tools are often terrible, and good ones are surprisingly expensive.


Eh, my "cheap" TRENDNet strip+crimp tool does the job just fine. I'm not a heavy user, but I've done well over 50 terminations with it, and every time I've had an issue it's been due to me doing a shitty job inserting the wires into the connector.


HDBaseT has somewhat consolidated the HDMI over twisted pair market, so you do have one more standard, but it replaces a dozen non-standards in the field.


>it's fine for the first 3 days until the clip breaks off.

Hasn't this been largely fixed by connectors that have a cover over the exposed clip?


That sounds really, really close to USB-C.


It's always struck me as strange that motherboards targeted at HEDT have wifi at all - better DACs for audio and 10G networking make more sense.


Did this for years with my ThunderBolt display, and have been doing it since 2015 with my LG 4K display.

The only bad thing is that because the LG is USB-C and not TB3, I only get USB 2.0 speeds out of it. I have a pair of USB-C to USB-A adapters that let me connect ethernet and my phone or other device to charge.


How does it transfer all these signals to the computer? Via DP?


DP over USB-C.

I currently use a Dell WD15 with my Dell laptop and I love it. Only a single cable to plug in, which drives my wired LAN, two screens, power, mouse, and keyboard.

This monitor with it essentially built in looks great; it would remove all the following from my desk:

- the dock
- the power brick for the dock
- one of the DP cables
- one of the screen USB cables


Yeah it sounds like an ideal setup for laptop!


Via USB-C.


I've advocated for this for years to anyone who will listen, but it's like preaching in the desert.

It doesn't help that many people are on laptops, and the war on ports (in general, and ethernet ports in particular) has made wired options exceptionally inconvenient. Not only do you have to go out of your way to buy extra hubs and adapters that are not included with the laptop, you then have to carry a bag full of dongles everywhere.


The only thing I miss is an 8P8C port, and only when I'm traveling and want a wired connection.

USB-C made my life easier. Now I only have to plug one cable from my monitor into my laptop and I have everything (wired network, external display, speaker, webcam, microphone, keyboard and mouse).

I don't see a reason to carry a dock with me. A lot of people carry a Micro-USB to USB-A cable and a dock instead of just buying a Micro-USB to USB-C cable, which makes no sense.


Honestly, I hate USB-C. It's flimsy as all get-out. All the USB-C stuff I have seems to maintain a poor connection when moved, and it doesn't feel anywhere near as sturdy as a plain-old USB A. When I'm using a laptop, I want something I can leave plugged in when I pick up and move the computer and not lose connection when jiggled. They should have made something that valued a sturdy, stable connection over slimness.


You should feel a distinct click when you plug in your connector. I never had any problem with my laptop.

What you described can be caused by lint inside your USB-C port. I had this problem only with my phone. Try to get it out with a toothpick.


I appreciate the help, though I've looked inside my ports before and found neither dust nor lint. The problem seems to be mostly with cheaper stuff. I've mentioned this to USB-C enthusiasts before and gotten back, "Well, just buy better-quality stuff." The difference is, I never had any problem with the connection on cheap USB-A devices. If the connector seemingly can't be made both cheaply and well, that is a huge flaw with the standard and means it ought to be abandoned wholesale.


Well, a bigger connector likely wouldn't work for phones then, would it?

As far as I can tell, phones are the #1 thing people are plugging in and out these days.


No, it wouldn't, but I honestly couldn't care less about phones. It makes absolutely no sense to attempt to design a port that is good for both delivering power to tiny devices and also good for transferring forty gigabits per second of data, all while being durable and maintaining a good connection.


Well, the whole point of the movement was to make one connector be the standard for most everything. The general populace is much more upset with having to keep a bunch of different connectors around. It's much easier for them if they can just have one connector to deal with.

It'd be nice for me too. I wouldn't have to keep an entire drawer full of various cables...


Apple should provide a free dongle purse with a purchase of a new laptop.


Geeks: We want Apple to stop using proprietary ports and standards.

Apple: okay here is a laptop that has standard USB-C ports.

Geeks: we have to use dongles to support the standard.


False dichotomy. They could have multiple ports supporting multiple standards. This dismisses a valid complaint, in my opinion, the war on ports in the name of thinness. I've never heard anyone cite thinness as the reason they bought a device (weight, sometimes, but never thinness.)


So how many “standards” should they implement when usb-c is versatile? Should they include VGA ports like some corporate laptops?


I genuinely don't understand what the author is trying to say. It seems like their thesis is this:

> Wifi (and bluetooth, etc.) sucker you in by making it seem like they “just work.” But if you investigate, you’ll often find that the wireless link is operating in a degraded state that performs much worse than a wired equivalent.

Wireless technologies are founded on exchanging the convenience of wirelessness for reduced performance and reliability. There's no "gotcha." That's the whole idea. Maybe that trade isn't for you, but it's not a secret. If you take any course on physical networks, they go over the advantages of not sharing the medium.

As for the hellishly complicated tech stack that is bluetooth... sure. I don't think it's because bluetooth is wireless. We've had regular articles here on the total mess that is USB-C[1]. Wired standards are not free from complex protocols and error-filled implementations. All things being equal, I imagine that wireless protocols will tend to be worse - but all things are frequently not equal.

There are no free lunches and wireless technology trades performance for convenience. I myself have some wired devices and some wireless devices and am quite happy with the result.

Edit: This feels like saying "Phones sucker you into thinking they're a computer. But if you investigate, you'll often find that they have less processing power than many other computers!"

[1] From 15 days ago: https://news.ycombinator.com/item?id=23435805


>There's no "gotcha." That's the whole idea.

Strange. I have closely inspected the box my wifi router came in, and I so far have failed to find "This is worse than wired ethernet" printed anywhere on it.

Perhaps you could show me a single wifi AP, anywhere, where the ad copy says anything even remotely close to that.


I think most routers promote how their wireless technology is different than older models (since that's what might cause users to buy a new router).

But, for example, the Netgear Nighthawk[1] page spends a lot of ink advertising that it offers 1300+600 Mbps networking (whatever that means) and also notes that it offers five 1000Mbps ethernet ports. Even without an understanding of the details of a shared medium, 5000Mbps of wired capacity is greater than 1900Mbps of wireless.

[1]https://www.netgear.com/home/products/networking/wifi-router...


It advertises that it's faster, not that it's more reliable. A reasonable customer might extend their experience with other objects and conclude the faster mode is less reliable: an F1 car destroys its tires in hours while the wheels on a commuter car last for years, an overclocked processor is less stable than one at stock frequency, supersonic fighter jets kill their occupants at a rate thousands of times higher than subsonic commercial airliners, etc.

Understanding that ethernet is entirely different from wifi requires... domain expertise! The exact thing a Best Buy customer won't have.

>5000mb wired capacity

A suspicious man might note that switching capacity isn't stated on that page, is not mentioned at all in the manual, and is the exact thing a manufacturer would be tempted to cut corners on, since vanishingly few customers will attempt multiple simultaneous network transfers in a home LAN. A normal consumer would never notice a switch capacity higher than 1 gigabit.


> Wireless technologies are founded on exchanging the convenience of wirelessness for reduced performance and reliability. There's no "gotcha." That's the whole idea. Maybe that trade isn't for you, but it's not a secret. If you take any course on physical networks, they go over the advantages of not sharing the medium. [Emphasis added]

I mean, yes, if you think the OP is addressing the article to people who have taken courses on networking, then maybe the article isn't saying anything interesting.

If you read it as trying to make people who don't know this more aware of the tradeoff they're making, it makes a lot of sense.

My guess is that very few people understand how much better wired is over wireless, or in which situations (I know that I don't always think of it, and I have lots of experience in networking).


There's also the reality that USB isn't really that much less of a mess than Bluetooth. With Bluetooth you have interoperability problems between the controller and the target device. With USB, you have that plus the interoperability of the cable... and it's a mess. USB does overall work better than Bluetooth, but the problems he's highlighting in the article very much exist in the wired space (including device polling screwing up latency).


>Wired standards are not free from complex protocols and error-filled implementations.

Sure, but once you figure it out for the set of devices you are using, that's it! It just works until it breaks for good. Wireless stuff... not so much. One day it works, the next it doesn't.


For certain devices like headsets, I find even the wired ones randomly stop working. I'm sure it's not purely random but typically due to opening up X, Y, and Z programs in a certain order. The issue is resolved by restarting, but that can be pretty disruptive in the middle of a work day.


> It just works until it breaks for good.

...or until you move furniture around and then need a connection over here instead of over there.


His beef is with broken network and BT stacks, I suspect. Not with the actual underlying standards.


Wireless is OK, you just need to use it correctly.

The trick is to treat it as a cellular system, and use the same methodology used to build cellular networks:

(1) Use only 5GHz for Wi-Fi. Turn off 2.4GHz WiFi - it's an overloaded, noisy spectrum band.

(2) One router per room, alternating channels between rooms.

(3) Use MU-MIMO (802.11ac gen2) or 802.11ax. The efficient MTU for Wi-Fi is much much larger than the MTU used for real time communication. MU-MIMO lets you send multiple small packets destined for multiple users inside one big wifi frame.

(4) Don't use mesh devices. They multiply the "airtime" of each packet by x2-x4 depending on number of hops and number of packet retries.

(5) Leave 2.4GHz for Bluetooth, and "junk" ISM devices (microwave oven, very-low bandwidth devices, etc).

People push wireless to the breaking point, then complain it's broken...
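
If you want to see how crowded the air actually is before picking a channel, a quick survey is enough. A minimal sketch for a Linux box with the `iw` tool (the wlan0 interface name is an assumption, and scanning usually needs root):

    # Count how many access points are audible per 5GHz frequency.
    import re
    import subprocess
    from collections import Counter

    scan = subprocess.run(
        ["iw", "dev", "wlan0", "scan"],
        capture_output=True, text=True, check=True,
    ).stdout

    # Each BSS block in the scan output has a line like "freq: 5180".
    freqs = [int(float(m)) for m in re.findall(r"freq:\s*([\d.]+)", scan)]
    per_freq = Counter(f for f in freqs if f >= 5000)  # keep only 5GHz BSSes

    for freq, count in sorted(per_freq.items()):
        print(f"{freq} MHz: {count} AP(s) heard")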


>(1) Use only 5GHz for Wi-Fi. Turn off 2.4GHz WiFi - it's an overloaded, noisy spectrum band.

This is an extremely bad idea in some countries, or if you’re not careful which channel you select. In nearly all countries some of the 5ghz channels are shared with RADAR and require DFS. In some countries (eg Germany) all of the channels do.

The idea behind DFS is to detect radar pulses and then shut down the network for a period of time, this can be triggered by radar or interference.


Radar/DFS is a whole other can of worms.

My rules of thumb for DFS:

(1) If you can use a DFS channel - use it. it's usually less congested than the regular channels

(2) If you're unlucky and DFS is triggered once per month or more, try repositioning the AP or changing the AP model to one less sensitive to radar pulses

(3) If no luck - use one of the non-DFS channels. Every country has those. In Germany, that would be channels 32-48 or 149+

Note that DFS (5.4GHz) is for weather radar. In some countries it's operated only in winter season. You may want to use different channels in summer and in winter :)


I ended up switching to 2.4GHz due to interference and have had substantially better performance since. 5GHz is overloaded in my apartment block.


"Ok" is worse than great (which is what you get with wired setups, all the time every time).

I understand your rationale, and I also like the convenience of wifi when it works. That being said, if you can wire "that thing" (whatever it is), just do it. One afternoon of wiring and it will work great for the rest of time. The trade-off is definitely worth it in my experience.


Well, that was a long "just". Installing a few sockets and a switch is much less effort.


> (2) One router per room, alternating channels between rooms.

That isn't realistic for most people, though.


Use powerline then - and you probably won't need a router in every room.


In my experience, it's not reliable enough. Sometimes it's great, but sometimes it stops working for a few solid seconds. I think it's when the AC starts up.


Anecdata, but my experience with powerline networking in a house with relatively new wiring was that wifi performs better and more reliably.


All good advice, but you forgot to account for the apartment building situation. Your neighbours could be screwing it up for everyone else.


Thus recommendation #1 - Use 5GHz only.

5GHz propagation across walls is much worse than 2.4GHz propagation, therefore only nearby neighbors may interfere with your access point, and on 5GHz there are enough non-overlapping channels to choose an alternate channel avoiding this interference completely.
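
Even the free-space numbers show the gap, and real walls widen it further. A quick sketch (free-space path loss only, so actual attenuation through walls will be higher, especially at 5GHz):

    # Free-space path loss in dB for distance d (meters) and frequency f (MHz):
    # FSPL = 20*log10(d) + 20*log10(f) - 27.55
    from math import log10

    def fspl_db(d_m, f_mhz):
        return 20 * log10(d_m) + 20 * log10(f_mhz) - 27.55

    for d in (5, 10, 20):
        delta = fspl_db(d, 5500) - fspl_db(d, 2437)
        print(f"{d:>2} m: 2.4GHz {fspl_db(d, 2437):.1f} dB, "
              f"5GHz {fspl_db(d, 5500):.1f} dB (5GHz is {delta:.1f} dB weaker)")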


WiFi analyzer shows about 26 access points on 5 GHz that aren't mine.

I agree it should be much better than 2.4 GHz -- only 2 of my neighbors are stronger than -70 dBm -- but more channels would be nice.

(I guess 6 GHz could help, but then I'll have to upgrade all my clients, too.)


There really aren't when you have 2x80MHz on either side of you, leaving only DFS, which can be tricky since a lot of older consumer routers didn't support it. WiFi 6E should help, as the new allocation is massive and propagates less.


160MHz channels are a rarity even today, neither the latest Samsung S10 nor the latest iPhone support 160MHz.

Much more common is 80MHz channel, and even then, there is a distinction between the "main" 40MHz and the "secondary" 40MHz sub-channel. If you monitor the packets over the air, you'll notice that the secondary channel is much less occupied - all mobile devices switch to 40MHz to save considerable power, unless downloading a big file.

In the US, there are 6 available 80MHz channels, which is more than needed (think 4-color graph coloring).
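
To make the graph-coloring aside concrete: APs are nodes, "can hear each other" is an edge, channels are colors, and greedy coloring is already good enough. A toy sketch (the room layout is made up, and the six 80MHz channel centers assume US rules):

    # Greedy channel assignment so no two APs that hear each other share a channel.
    CHANNELS = [42, 58, 106, 122, 138, 155]  # 80MHz channel centers (US)

    # Hypothetical adjacency: which APs can hear which.
    neighbors = {
        "living":  {"kitchen", "office"},
        "kitchen": {"living", "bedroom"},
        "office":  {"living", "bedroom"},
        "bedroom": {"kitchen", "office"},
    }

    assignment = {}
    for ap in neighbors:
        used = {assignment[n] for n in neighbors[ap] if n in assignment}
        assignment[ap] = next(c for c in CHANNELS if c not in used)

    print(assignment)  # {'living': 42, 'kitchen': 58, 'office': 58, 'bedroom': 42}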


All of it is true, but before going to multiple APs, you should make sure you really need it - you would need to manage multiple networks, and getting the APs to work together and allow devices to roam seamlessly might be a pain.

This is where the mesh units might come in handy - they are managed and configured in a single place, and can work together nicely - but of course you have to set them up with wire backhaul instead of in the "mesh mode".


I agree. "Mesh node" with wire backhaul is good and usually solves the seamless roaming issue (YMMV, check before you buy).


That isn't mesh; that's just how wifi is designed to work, with a wired DS (Distribution System). Mesh uses radio for the DS.


> (1) Use only 5GHz for Wi-Fi. Turn off 2.4GHz WiFi - it's an overloaded, noisy spectrum band.

Even recent low-end phones sometimes have just 2.4GHz Wi-Fi. Even if all your devices support 5GHz, running 5GHz-only Wi-Fi may cause problems for your guests.


That would be great if you can afford so many access points, all your clients support 5 GHz, and you are able to run Ethernet to the access point in each room.


Many people complacently or willingly use wireless technology for fixed-location devices. While I can appreciate that running Ethernet cable can be a bit frustrating, seeing a desktop computer using wifi makes my head hurt.

Especially in the current work from home era, I've seen a resurgent enthusiasm for desktop computing. The simple rationale is: if you're going to be stuck at home and have a decent home work environment, why suffer the downsides of portable devices? Why not leverage the upsides that come from fixed-location workstations (large displays, faster higher-wattage CPUs and GPUs, wired full-size input devices, wired networking, and so on).

It's been amusing to see how many people have had their eyes re-opened to the advantages of high-end desktop computing thanks to environmental circumstances.


WiFi is faster than most people's WAN connection so it's fine.

High wattage machines make rooms uncomfortably hot and expensive to cool.


Fast WAN connections are becoming more common. Also while WiFi has decent bandwidth, the latency/jitter/loss are still ass.


I enjoy the convenience of wireless. But I do make a conscious effort to put devices that are going to stay in one location on Ethernet if possible. I even specifically bought a certain Roku model mainly because it had Ethernet.

So basically the only things on WiFi are laptops, phones, and devices which don't support Ethernet. For the most part I don't notice issues with my WiFi. I do use the Ubiquiti Unifi access points for providing WiFi in the house.


Yeah, I set the computer/gaming room based on where I had the easiest wire networking access in my house. Everything else spreads out from there. Wireless is fine for your phone or tablet, but I've tried to get wired network runs out to where the TVs are and the like.

I also prefer the security barrier wiring provides: My wireless network is a completely separate firewall zone than my wired one.


I know people who have to have everything wireless. Little do they realize that all that extra radiation is probably slowly killing them. On top of that is the security risk of having personal data leaking out of your room to whomever decides to eavesdrop on the signal (A threat model which becomes clearer when you see how easy it is to collect signal leak)


> Little do they realize that all that extra radiation is probably slowly killing them

Haha, honestly people could probably use more radiation, i.e. sustained daily sunshine, to live healthier.


The sun beams down 1 KILOwatt of light (50% infrared, 40% visible, 10% other) to every square meter on the ground. Most of the infrared and much of the visible light is absorbed by your skin. Your WiFi router on the other hand has a power of 50-150 MILLIwatts, and won't let you set it higher because that's illegal. And your phone etc. run at around 15 milliwatts.

I guess if you built a ridiculously high powered WiFi antenna (which would be illegal) and stood next to it for a while, you would cook yourself. But BT/WiFi ain't gonna do shit to you.

https://m.youtube.com/watch?v=i4pxw4tYeCU
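
For a sense of scale, treating the router as an isotropic radiator (a simplification, but fine for the order of magnitude):

    # Compare WiFi power density at 1m with sunlight at ground level.
    from math import pi

    def power_density_w_per_m2(tx_power_w, distance_m):
        # Isotropic radiator: power spread over a sphere of radius r.
        return tx_power_w / (4 * pi * distance_m ** 2)

    wifi = power_density_w_per_m2(0.1, 1.0)  # 100mW router, 1m away
    sun = 1000.0                             # ~1 kW per square meter of sunlight
    print(f"WiFi at 1m: {wifi * 1000:.1f} mW/m^2")
    print(f"Sunlight is roughly {sun / wifi:,.0f}x more power per square meter")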


Standing outside in the sun all day everyday is absolutely deadly.


That's somewhat beside the point though, isn't it? Wifi and Bluetooth are not giving off anywhere near that much radiation.


If that were the case, there would be no African tribes left alive.


Clothing, night-time, and shade all make their exposure far less than 24/7.

Also higher melanin skin blocks?/absorbs? radiation.


Well, things that are deadly don't necessarily kill everyone.


By that rationale, walking is deadly. Breathing is deadly. Sleeping is deadly. This is not a valid argument supporting the stance that wifi is deadly.


Sorry, I didn't mean to argue that wifi is deadly (because power is so much lower), I'm merely agreeing with the parent that the sun is deadly. I think the links to cancer are strong enough to warrant the term.


The wireless mouse has been the exception to this rule for me so far: the G Pro wireless (and some other mice) don’t have the latency issues, and have egregious battery life (and the charging cable fails over so it’s basically just a wired mouse).

The lack of a cable saves a bunch of space and avoids the floppy cable. Probably irrelevant if you don’t game, but definitely quite convenient for FPSes.


I don’t think I’ve ever heard egregious used with a positive connotation. It looked out of place at first, but makes perfect sense etymologically. Thanks!

https://www.etymonline.com/word/egregious


I actually can't decipher strstr's post. Is it in favor or against wireless mice? Is the battery life good or bad? Are they using the mouse plugged in to the charging cable constantly or not? I'm legitimately baffled and not trying to be a grammar police.


They use egregious in the "extraordinary" sense, meaning the battery life's great. Once it does need charging, it can still be used via the cable during charging.

I have to agree; mice specifically have been the least painful Bluetooth experience for me, with a Microsoft one. The annoying part is that it changes MAC address when pairing, so regularly using it with more than one computer is a frustration, but that's nothing compared to the constant interruptions I've had with keyboards and the codec issues with headsets.


I actually quite enjoy having a wireless mouse even though I don't game (nothing more than OpenTTD or Heroes of Might and Magic).

Like you, I have a "gaming" mouse (a G700s); in my case I bought it for the high resolution. I used to have a G9x, but it developed an unbearable whine. Battery life is downright atrocious; I basically have to recharge it every day. It's also comically heavy.

I often have to move things around my desk, and having papers getting caught up in the wires is aggravating. Also, apart from gaming mice with very supple cables, I always found wired mice to be a pain because of the wire that's never quite in the right position and pulling and / or getting caught on things.

The same happens with keyboards, but to a lesser degree. I often use a trackball, which is like a keyboard in that it doesn't move around and I still prefer having those wireless.


I just switched from a wireless mouse. With high DPI and illumination, the battery lasted a single day. The wired mouse is also much lighter.


I run my G603 on a single AAA battery with an AAA to AA adapter. So far I've been exclusively using "dead" batteries from other devices – each lasts a couple weeks! (In low Hz mode at least.. I don't feel any benefit from 1000Hz mode, even in Counter-Strike.)


Really depends on the mouse implementation: the G Pro is only 80g. It's hard to get a wired mouse much lighter without a honeycomb shell (mice in that style can push into the 50s).


One thing to remember is that the g pro wireless is a standout among many “okay” implementations


I've been using a G602 for more than 5 years now; it performs like a wired mouse and doesn't eat batteries.

I will never use a Bluetooth headset; the technology is not there yet. The latency and battery usage of Bluetooth audio devices are inconvenient and I don't understand how people can bear it.


The GPW also has a wireless charging mousepad available.


Hotels have universally switched to wifi, but I miss having an Ethernet jack. I think that at least some hotel Internet problems are actually wifi problems and not due to the crappy service provider they use.


A lot of hotels still have ethernet, but while the wifi might be free, more often than not the ethernet is not. In many cases some form of tunnelling allows you to connect anyway (e.g. IP over DNS).


There are two changes that had a surprisingly large impact on the perceived flow and effectiveness of my daily work:

1. Ditching notebooks for fast desktops

2. Replacing wireless connectivity (esp. networking) wherever possible.


I’ve been doing the same. I’ve recently replaced the following:

* Macbook wifi -> wired Ethernet

* Apple Bluetooth trackpad -> wired mouse

* Apple Bluetooth keyboard -> wired

* Bluetooth / Logitech 2.4Ghz mouse -> wired

The one problem this hasn’t fixed is keyboard connection to the MacBook. I still have to unplug the USB cable from the (Anker) hub and plug it back in again whenever the MacBook has been asleep. It’s better than forcing a re-pairing of Apple’s own keyboards (yes I’ve had two) with their laptop every day or two though.


Why would you get rid of the Logitech 2.4ghz mouse? These are great (well, the gaming mice), absolutely no difference in performance vs wired, no pairing required..


I'd sometimes have to remove the dongle and re-insert it for it to be recognised (again) by the OS. When I used it via bluetooth it would keep failing to connect and need manual deletion from the list of paired devices and re-adding.

Also its dongle seemed to interfere with my wireless Apple keyboard. Not sure but it seemed to correlate a few times.

It's much better with its own dongle, yes, but ... I got sick of the occasional issues. No issues with a wired mouse.


Experience might be different depending on OS. On macOS, my Logitech mouse would regularly forget its settings, forcing me to reload the profile I’d saved in the software.


My Magic Trackpad 2 has input lag the first time you touch it after more than, say, a second. Apple support said it might be interference and to turn off any other nearby Bluetooth devices. Oof. Ok, thanks.

Perhaps the next product I buy from them in an Apple Store I’ll pay for by wafting my card around the contactless terminal a little before walking out the store saying “if it didn’t go through, try turning off any nearby devices!”


You can use the Magic trackpad and Keyboard also with the cable. It won’t use Bluetooth then.


I use my Bose QC35 Bluetooth headphones with my MacBook to watch movies. The system has a comfort meter: when my head is in exactly the most comfortable spot on the couch, the Bluetooth signal drops out completely.

It also drops out when I touch my head or my face.

I have tried reorienting the laptop to no avail. No matter where the laptop is (or, in fact, any components of the system, including the couch between 1m and 2.5m distance from the laptop) it always finds the same comfort spot to drop out completely.

If I move my neck about 15 degrees to an uncomfortable angle, the signal is perfect.

An anecdote, yes, but a relevant one.


I had the same issue and resolved it by buying a USB extending cable and attaching an external Bluetooth transmitter closer to the ceiling. It's not pretty but it works.


I may try that. Thank-you.


I’ve found the Sennheiser RS series of wireless headphones much better than any Bluetooth headphones for connectivity for watching movies.


your body absorbs the EM radiation and shields the receiver from the transmitter.


Sure, but that does not strictly explain the phenomenon.

The headphones have clear line-of-sight to the laptop in both the "off" (comfortable) position and the "on" (infinite set of uncomfortable) positions.

The absorption of the radio signals by my large capacitance, I would think, doesn't change much with position. Also, it turns off when I scratch the back of my head.


So, poor design? I mean, it doesn't matter what the technical explanation is; it doesn't work, right?


Can anyone recommend a bluetooth headset (i.e. speakers and mic) which does not have this issue, as mentioned in the article:

"Low quality. Related to the codec issue, many bluetooth devices will play high-quality audio when the microphone is turned off, but degrade to much lower-quality audio when it’s turned on. You can test this for yourself if you have a bluetooth headset: play music on it, then open your microphone settings to the page where it shows the mic input volume. You’ll probably hear the audio cut out for a second, then return at lower quality. (This happens even with devices you might expect to be high-end, like my Airpods Pro + 2018 Macbook Air.)"

I have been trying to find one, but have failed so far, and I do not have an unlimited budget to buy and try many options.


You need to look out for headsets that support the Hands-Free Profile version 1.6 [0]. This version adds optional support for much better audio quality when using the microphone. I'm currently using the Sony WH-1000XM3. Unfortunately, HFP 1.6 does not yet seem to be supported on Linux (i.e. BlueZ/PulseAudio).

[0]https://en.wikipedia.org/wiki/List_of_Bluetooth_profiles#Han...


HFP 1.6 does not solve the problem. Surprisingly, there is no solution [1]. HFP 1.6 supports "wide band speech with the mSBC codec," but this sounds terrible compared to the CD quality sound you get from the unidirectional A2DP profile. Some OSes automatically toggle from A2DP to HSP during a phone call (eg. the "auto_switch" option of PulseAudio's module-bluetooth-policy), but you can't use both profiles at the same time.

[1] https://www.ipetitions.com/petition/duplex-high-quality-audi...

This is why the wireless ModMic [2] talks about Bluetooth codecs but actually requires a custom USB wireless receiver.

[2] https://antlionaudio.com/blogs/news/introducing-modmic-wirel...
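
On Linux/PulseAudio you can at least flip the profile by hand instead of relying on auto_switch. A rough sketch (the card name below is a placeholder and the exact profile strings vary by setup, so check `pactl list cards` first):

    # Switch a Bluetooth headset between A2DP (good, one-way) and HFP/HSP (bad, two-way).
    import subprocess
    import sys

    card = "bluez_card.00_11_22_33_44_55"  # placeholder; look up your real card name
    profile = sys.argv[1] if len(sys.argv) > 1 else "a2dp_sink"  # or "headset_head_unit"

    subprocess.run(["pactl", "set-card-profile", card, profile], check=True)
    print(f"switched {card} to {profile}")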


Yeah. It's possible that Bluetooth LE Audio could improve this, we'll see once they finally get around to releasing the specs. It certainly sounds like it's designed to support bidirectional audio links without the stupid HSP/HFP divide of classic Bluetooth. (There's more mainstream demand for this feature now due to things like voice assistants.)


What about dedicated Bluetooth headsets for phones, like plantronics? Aren't they supposed to work bidirectional with high quality audio? Or do they also use the standard Bluetooth profiles/protocols?


Bluetooth headsets like the ones Plantronics makes are designed for making phone calls, and the limitations of the standard Bluetooth headset protocol weren't a problem for that, because standard phone lines were monaural with the exact same frequency response and bit-depth limits, so you weren't losing anything extra. The trouble is that people want to use Bluetooth audio for things like gaming, video conferencing, and using voice assistants while playing music, which call for much higher-quality audio. Also, phone calls themselves are slowly improving too, with things like HD Voice.


Hey, I spent a few hours trying to get those headphones working yesterday. Every time I selected HFP on Ubuntu 20.04 the selection just went blank. Any tips greatly appreciated! Thanks.


I have those headphones too, and they work very well with Linux. Actually, they work even better than Windows since Linux can be convinced to send LDAC instead of aptX. I do that using pulseaudio-modules-bt [0].

If you're into listening to music, you may be interested in AutoEQ [1]. Those guys have done some measurements and provide info on how to equalize the sound. For this particular headset, it's somewhat less boomy. You can use PulseEffects [2] to control equalization.

[0] pa-modules-bt: https://github.com/EHfive/pulseaudio-modules-bt/wiki/Package...

[1] AutoEQ: https://github.com/jaakkopasanen/AutoEq

[2] PulseEffects: https://github.com/wwmm/pulseeffects


You probably won't find one, because the Bluetooth standard doesn't allow it - it demands a switch to low-quality codecs for two-way communication. This hasn't really changed since BT 2.x.

Perhaps using something like Apple proprietary headphones or headsets with non-BT radio receivers would work.


This was my biggest disappointment with the AirPods Pro. On my phone the sound and mic quality is good. If I try to use them on my MacBook Air in Zoom it's totally crap. Audio-only is good-ish (I get some brief disconnects when there's no sound sometimes, but the audio quality is good). With the mic on, it's absolutely not.


You might be able to get around this by using the laptop microphone instead of the one on the AirPods. Option + click the volume icon in the menubar during a call and choose the MacBook microphone. Audio should switch to A2DP and sound much better.

This works for me on two different bluetooth headsets, but I've never tried AirPods so YMMV.


Yeah, I use an external Yeti Nano mic, which is awesome. But I don't think it's too much to ask for BT headphones that cost me 280 EUR to deliver reasonable sound quality on my Mac... is it? (Reasonable = as good as on my phone)


They can have that exact quality - just connect them via cable ;)


This is certainly better than using both the laptop mic and speakers, but especially on MacBooks, where the fans are always going wild, it can still be awful.


> Apple proprietary headphones or headsets with non-BT radio receivers would work.

That doesn't exist


Which part exactly? Apple has the W1 thing and there are headsets that use their own radio implementation - https://www.logitechg.com/en-ch/products/gaming-audio/g533-w...


I would consider myself lucky if I have a stable connection...

I have the latest Sennheiser Momentum Wireless and it disconnects itself from my Mac many times, even though the Mac is less than 1 meter away...

I don't even really care about the degradation of sound quality now because the disconnection is so tormenting.

I was having an okayish connection experience with Beats years ago, and I guess that's because they've been tested and may be optimized for the Mac.

Rumor says Apple will release headphones at the upcoming WWDC tomorrow. I hope I'll eventually have headphones with a stable connection...


The upcoming "Bluetooth Low Energy Audio" standard should improve call quality. I haven't seen any devices implementing it yet, though.

I'm not inclined to buy another pair of Bluetooth headphones with the microphone quality currently available.


I've vastly improved the quality of my wifi by upgrading my routers and my devices. The important thing to remember is that your wifi always performs like the worst device on your network. A dead-zone device hurts every device on your network. A G-only device hurts every device on your network.

So: two routers in my small house, hooked together by wire. All my priority devices have new dual-band hardware and are on the 5GHz network. Old devices are relegated to the 2.4GHz network, which is quite spotty owing to that.

Unexpected things can do real damage too. I had an older PC that had its wifi dongle plugged into a USB 2 port and it would frequently get multi-second ping spikes. The same wifi dongle worked fine on a USB3 port. I ended up wiring that one in.

Personally I've abandoned all wifi hardware companies other than Asus - every other one has let me down at least once, if not repeatedly.

In every other respect, Asus seems to be incapable of making devices that rate higher than a B plus, but in the lowered expectations of the world of wifi hardware companies where "occasionally functional" seems to be considered good enough, Asus' spotty quality is a huge step up.


> A G-only device hurts every device on your network.

Doesn't need to be on your network, just in your vicinity like from a neighbor.


What is the G only device doing that degrades performance on a dual band wireless network?


Yeah, that's a fact most people miss. Every transmission on the same channel as your wifi network slows you down. Every time two devices on that channel transmit at the same time, they both have to retransmit. They both wait a random time before retransmitting, which is good, but then they also do the retransmission at a lower speed, occupying more channel time, in order to make it more likely that the transmission succeeds. This can mean that every device eventually gives up on high-speed transfers and starts sending every packet at the lowest possible speed.
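
A toy worst-case illustration of that effect, with made-up numbers (the rate ladder and frame size are illustrative, not a real rate table):

    # Each failed attempt wastes its airtime, the contention window doubles,
    # and the retry is sent at a lower PHY rate, so it occupies far more air.
    import random

    FRAME_BITS = 1500 * 8            # one full-size frame
    rates_mbps = [300, 150, 54, 6]   # fallback ladder (illustrative)

    cw = 15                          # initial contention window, in slots
    slot_us = 9                      # 802.11 OFDM slot time
    total_us = 0.0
    for attempt, rate in enumerate(rates_mbps, start=1):
        backoff_us = random.randint(0, cw) * slot_us
        airtime_us = FRAME_BITS / rate         # bits / Mbps == microseconds
        total_us += backoff_us + airtime_us
        print(f"attempt {attempt}: {rate:>3} Mbps -> {airtime_us:6.1f} us on air "
              f"(+{backoff_us} us backoff)")
        cw = cw * 2 + 1                        # exponential backoff

    print(f"worst case total: {total_us:.1f} us, vs {FRAME_BITS / 300:.1f} us for one clean try")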


Buy a router that supports flashing https://openwrt.org/ . It saves a lot of time. After about 2 years of trial and error I figured out why the network becomes bad and stops working when I run torrents. I never suspected the router was the culprit and thought the ISP had an issue; it turns out the ISP is very reliable most of the time and it's the router that is having problems. The problem with the router is that the debug logs are a joke and don't tell you anything. OpenWrt made a significant change to my life. Thanks, OpenWrt developers.


His point about wifi network polling is spot on. I noticed this exact issue once all of my meetings moved to Zoom. It turns out that location services on my Mac frequently poll all of the nearby networks, causing tons of dropped packets which broke up the audio and video in Zoom. Once I figured this out, which took a while, I had to completely disable location services to resolve it.

Switching to a Unifi router and picking an empty DFS channel also improved my wifi experience significantly, since I live in a NYC high-rise. For those that don't know, many routers allow you to use a set of 5GHz "DFS" channels that are normally reserved for aircraft radar. By regulation, any router broadcasting on a DFS channel must be able to detect radar interference and switch to a non-DFS channel immediately for a period of 30 minutes or so. If you live in an area that aircraft regularly hit with radar this can be very disruptive, but if not, these channels often perform better than the dedicated 5GHz channels since most routers don't use DFS channels by default.
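
If you want to check which channels need DFS before fiddling with the router, the US split is easy to encode. Treat the table below as my understanding of current US rules rather than gospel, and check your own country's regulations:

    # Which US 5GHz channels require DFS (radar detection)?
    NON_DFS = {36, 40, 44, 48, 149, 153, 157, 161, 165}    # UNII-1 and UNII-3
    DFS = set(range(52, 65, 4)) | set(range(100, 145, 4))  # UNII-2A and UNII-2C

    def requires_dfs(channel):
        if channel in NON_DFS:
            return False
        if channel in DFS:
            return True
        raise ValueError(f"not a (US) 5GHz channel: {channel}")

    for ch in (36, 44, 52, 100, 120, 149, 165):
        print(ch, "DFS" if requires_dfs(ch) else "non-DFS")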


As a wireless hater (and especially a hater of designed-to-fail rechargeables), I would love it if the trend reversed. I don't want to have to manage charging devices. I want a keyboard that works, always, as long as it is plugged in.


I've got a couple of wireless devices, and all of them charge using either micro USB or USB C, and my Logitech G mouse is the only one with a custom receiver, all the rest are Bluetooth. It's great. I no longer snag cables to the point of breaking with my chair, and if I do, they're just USB cables and are replaceable.


Going from 2.4GHz to 5GHz was great for stability and speed... until Spectrum started pushing 5GHz as the default and crowding the 5GHz spectrum here. Then I hooked up cat5e, upgraded to gigabit service, and it's been bulletproof on my desktop. I can't go back to wireless after wired.

I wonder if 5G is going to feel like Ethernet gigabit instantaneous levels of speed?


I love ethernet connections - you plug it in, it goes. Doesn't care what your neighbor is doing, and it's my understanding that there's less overhead in sending-receiving data over it.


Related: anybody have experience with powerline ethernet? Are there good adapters that can make high-quality/high-speed ethernet connections over power wiring inside a home? I know these are a thing but I have no idea how good their quality is or how strongly they depend on the wire itself.


In my first flat I had around 400mbit/s via some cheap powerline adapter, in my second flat I couldn't connect them at all...

You'll just have to try; it depends a lot on your home wiring and can be interfered with by so many things. But if it works, it's definitely nicer and more stable than WiFi.


I have the Extollo LANPlug 2000 and it has by far the best performance (rock solid pings of about 3ms in my house), but:

1. It works terribly if you use more than 2. There's just some bug in the L2 switching where it will get confused about the destination; broadcast packets get through, but some packets to specific hosts don't go through. Rebooting the adapter fixes it temporarily. I ended up moving one device to WiFi and as a pair it works reliably

2. The firmware it ships with is a bit limited, but they provided me with something more configurable when I contacted support about issue #1


I had the same experience with some TPLink adapters. mdns never worked reliably so I couldn't see some printers or speakers on the other end.


The more advanced firmware they sent me allows tuning what broadcast traffic is allowed through. Note that with either firmware, broadcast traffic worked fine, but it would just seem to get confused about where a specific MAC was.


Likewise. Running four tp-link gigabit (one is also wireless). Have to reset the whole lot every four days!


Powerline uses very similar technology to WiFi, except that power cables are significantly noisier than air. As such, they usually perform significantly worse than wifi, with the exception of certain cases like buildings with 19th-century 2m-thick walls.

In most other cases, quality WiFi equipment performed better and was more stable.


This has been my experience. I bought some Devolo powerline adapters a few years ago to try in my apartment, and got quite poor speed (<100Mbps) and network was rather unstable. Switched to WiFi and got better speed and stability.

Last year I moved to a new place and faced the same issue of wanting internet in a different room and tried the Devolos again. Same experience except worse speed. Again I had to fall back to WiFi, which at least gets me ~150Mbps here.


That depends on where you are. In a drywall or wooden house, this would be true, but in concrete apartment building with thick walls and many other networks, it might not be the case.


Sorry, that's total bollocks, as we say. I have four 1200+ Devolos and they are solid as a rock. And what do mains emissions have to do with this?

Powerline is much less faff than using wifi, even more so for non-networking types.


Powerline depends a lot on the state of your power wiring.

I have Devolo Magic2 G.hn powerline adapters. In my rental UK apartment (1990 or so vintage), I can barely get 100 Mbps for the 10 meters or so from my office to the living room. The same units with mesh-wifi sync at around 300 Mbps in my parents' much larger 1980 vintage house in France. Perhaps the difference is due to the UK's crackpot ring electrical standard.

I wish someone would make laser free-space optics for in-home use (they are available for company campus networks).

WiFi 6E adds a full GHz of bandwidth in the 6GHz band and should dramatically improve congestion, but you will still need multiple access points with some form of wired backhaul to get reliable WiFi.


I have the TP-Link AV2000 and found them good. My office has a bunch of metal laundry appliances and concrete walls between my desk and the wifi router, so wifi signal is terrible there. Solved it with the AV2000.

I also tried using a mesh wifi network, and was able to bounce a signal around the dead zone and get a WiFi signal with a higher bandwidth than the powerline ethernet, but I still found I got a lot of glitches in video calls, so went back to the powerline ethernet.

I've not noticed the problems other commenters here had with their adapters with the AV2000.

I initially tried a cheaper powerline ethernet, and it was worse than WiFi, so I'd recommend you buy the most expensive/best one that you can afford.


My desktop is connected through DLink AV1200 PLCs.

Every once in a while they lose connection (maybe once a month).

The speed is not great (~300Mbps) but I don't care because my inet connection is only 100Mbps. The latency is better than wifi, and really consistent (unlike wifi):

    --- 192.168.1.1 ping statistics ---
    18 packets transmitted, 18 received, 0% packet loss, time 48ms
    rtt min/avg/max/mdev = 1.975/2.669/4.884/0.617 ms


I have Devolo dLAN powerline adapters and they deliver around 300mbps even though they are marketed as “AV1200” and have Gigabit Ethernet ports.

So yes there are adapters that are decent, but you need to aim for the fastest ones because real-world performance seems to be 1/10th of the advertised performance.


I had a pair of TP-Link PA8010 gigabit adapters for a run of about 10m. I had pairing issues every day, regular "random packet loss", and large latency spikes to my router, none of which I was seeing on a wifi device. It was better than nothing, but not something I'd recommend.


We tried it years ago. Performance was bad and it seemed like it's a source of a lot of electrosmog. We went back to cables after 1 week.


They can be electrically noisy and annoyingly interfere with ADSL.


And also stomp all over various HF radio bands. They're a nuisance.

Anything that converts long runs of unshielded cable into an incoherent broadband transmitting antenna should fail FCC certification, yet they're still on sale.


A run of ethernet cable costs money to install, but once done it uses no power, needs zero maintenance, and gives you 1000Mbps or more, full duplex, reliably. (I'd discard Ethernet over Power before WiFi; cable is so much nicer.)


All my walls are solid brick, so it's a choice of cabling tacked to skirting boards, cabling under the floors or excavations. I'd pick powerline over those options.


All my walls are solid concrete. I recently had the house cabled up because the powerline stuff dropped multiple times a day for 5 minutes at a time. It was infuriating.

The cabling runs through the ceiling and dropped down to port height via electrical conduit on the external side of the wall. It was expensive but I would have paid more considering the quality of life improvement.


I spent a couple of hundred on a mesh wifi system and while it has two downsides (double nat, and a higher base latency of around 6ms instead of <1ms to the router) it's otherwise flawless. I have a small flat (500 sq ft) and it took 3 carefully positioned mesh points (within LOS of each other) but it works great.


Not an option for rentals, sadly.


I've been using ethernet cables at home recently, on account of having a bunch of work stuff here. Ethernet = better, more consistent throughput.

So I've settled on using rubber cable floor covers. They don't look very good, but they're perfectly practical: cheap, much reduced trip risk, no need to open up the walls or tape/nail anything to them to get everything arranged.

(Not sure how I'd feel about this on a permanent basis, though. Once it's back to just me, my laptop, my iPad, and my phone, the internet connection will be the bottleneck again, and I'll be switching back to nothing but wifi...)


I take it you don't have a wife :-)


I have a wired headset and everything on my home office desk is wired except for my mouse (now mice, since I'm trying out a 3-device logitech M720) and keyboard.

(Incidentally, the "lightspeed" dongles Logitech uses do work at 2.4GHz, but with a proprietary protocol -- which I think someone has already gone to the trouble of documenting, since IIRC it uses only the frame sizes and types it really needs to.)

Headsets, mics, laptops, are all wired (you can get pretty decent USB-to-Ethernet adapters for cheap these days), and give me zero trouble, but I could never go back to wired keyboards and mice because of the freedom and flexibility you get (I'm trying to wean myself off the Apple keyboard/trackpad combo into a Logitech K380/M720 combo to be able to switch devices more easily).

Unless you're a gamer (and really need a wired mouse to shave off those milliseconds), there's really no contest in terms of input devices. Audio, that's another matter entirely (I only use Bluetooth on the go).

OTOH, I have an Airport Extreme base station (set to 5GHz) underneath my desk for those times when I do need wireless (like bringing in a laptop for checking things, or my iOS devices).


I kept jumping between wireless and wired because I'm annoyed by wires, and I keep getting burned by wireless pitfalls through the years. But each round, wireless seems to be getting slightly better.

Here are some tips I collected along the way:

1. 2.4GHz sucks, use Bluetooth when possible. But Bluetooth sucks, too.

2. You want to have fewer Bluetooth devices when possible, to reduce the number of simultaneous connections. For example, I only use my Bluetooth mouse for gaming and the trackpad for the rest, so I disconnect the mouse when I'm not playing video games.

3. You want to prioritize some categories of Bluetooth devices. For example, Bluetooth keyboards usually work great, so you may want to make your keyboard wireless first. Bluetooth mice and trackpads are okayish, but not as great. Bluetooth headphones and speakers are usually bad. If you really want to use Bluetooth for audio, devices from the same company sometimes connect better: Beats Studio seems to have a much better connection than Sennheiser and B&O on a Mac.

After all, I still love the idea of wireless because it is nice on paper. It is mostly the de facto protocols and implementations that are terrible.


I've found that sometimes fiddling with some wireless settings like operation band, channel, tx power, WMM, etc. helps, provided you know what you are doing. Especially so if you're in an overcrowded wifi zone or in a poor reception area.


>2.4GHz sucks, use Bluetooth when possible.

Huh? Bluetooth is 2.4GHz.


I'm sorry for not being clear on this but I'm not sure how to name it in English. I was referring to those wireless devices with a small USB receiver.


In the professional wireless development space, we call those "generic ISM band crap".


Bluetooth is way worse than good custom protocol dongles (e.g. Logitech "unifying receiver")


You're correct, it's just that vendors call their proprietary non-bluetooth protocols that.


I recently played doom 1 with my girlfriend on our home wifi (chocolate doom for those interested). Just 2-player LAN coop.

The lag made it unplayable. We plugged in some ethernet cables and it ran fine after that.


I will always prefer wired over wireless, but I am currently unable to run ethernet to my desktop (my partner is not keen on having a tripwire run straight across the house). So I did the same ping test the OP did and noted a few spikes: 10 pings over 250ms out of 600, distributed unevenly over time. I guess I could be subject to such wireless degradation, but I can't say for sure without trying an ethernet cable. The spikes (ping number: ms):

    32: 1725.0, 33: 662.0, 68: 406.0, 100: 980.0, 101: 285.0,
    312: 454.0, 342: 746.0, 464: 453.0, 480: 1852.0, 481: 841.0
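
For anyone who wants to repeat the test, this is roughly what I did, scripted (it assumes a Unix-style ping whose output includes "time=1.23 ms"; swap in your own gateway address):

    # Ping the router for a while and report round trips above a threshold.
    import re
    import subprocess

    GATEWAY = "192.168.1.1"   # replace with your router's address
    COUNT = 600
    THRESHOLD_MS = 250.0

    out = subprocess.run(
        ["ping", "-c", str(COUNT), GATEWAY], capture_output=True, text=True
    ).stdout

    times = [float(m) for m in re.findall(r"time=([\d.]+)\s*ms", out)]
    spikes = [(i, t) for i, t in enumerate(times, start=1) if t > THRESHOLD_MS]

    print(f"{len(times)} replies, {COUNT - len(times)} lost")
    print(f"{len(spikes)} spikes over {THRESHOLD_MS} ms: {spikes}")  # (sequence, ms)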


I'm happier and happier with my wired Shure SE215s. I realize that wireless is super convenient, but for someone who doesn't use their headphones a whole lot - mostly for long conversations and videos, cables don't bother me so much. And another plus is I can just change my cables when they wear out (MMCX standard): less waste, less to throw away.

What does bother me is having to use a dongle with my Pixel 3; the Apple USB-C to 3.5mm adapter is thankfully better (and smaller!) than what Google shipped with the Pixel 2, which suffered from really annoying echo problems.


I bought a wired Jabra headset for these pandemic times, but nobody actually cares, and we chat without video. I went back to the cheap BT headset: there are too many advantages to this freedom; I can make a coffee or get the baby while "in the meeting".

But the most important point: I want to bet on the wireless future because I hate wires. I loved seeing my table clean when I had a wireless Mac keyboard. And if Apple can make a keyboard that works reliably over BT for several months, then that should be our target...


About the network polling addressed in the article: why does any app even need to poll for networks? On every computer, there's only ever one app that needs to poll for networks, and that's the app that maintains your network connection. For every other app, the network should simply be a transparent service that they use and don't control. They have no business looking at other wifi networks. They don't even need to know whether they're on wifi or ethernet. Maybe they can ask the network app how crowded the network is, or whether they're free to use as much as they want or should tone it down a bit (which could be because it's too busy, because there's not enough bandwidth, or because it's a metered connection), but other than that, they don't need to know.

Lately I keep running into these sort of weird, stupid problems in OS design, and it's making me want to write my own OS (which I won't do, because I lack the skill and knowledge).

I agree with much of the rest of the article too. I've always been annoyed that Apple doesn't sell a wired touch mouse. I love the touch mouse, I just don't love running out of battery while working. And I certainly don't like the way their more recent mice need to be charged with a wire inserted at the bottom! Who ever thought that was a good idea?!

In some situations, wireless can be very convenient, but when you're not walking around, wired is usually better in many ways.


> However, every time I can remember helping someone track down the source of their connection problems, the culprit has turned out to be their wifi.

This is not my experience. I’d estimate wifi to have been the problem only 20–40% of the time. I’ve found a poor ISP to be the problem much more commonly. The last few percent of cases are other miscellaneous network topology problems, typically failing wired hardware, a buggy router, or a dying dumb switch.

(I’m excluding “too far away from the router” from my reckoning, because in such cases that’s been known to be the problem, and it feels unfair to criticise wifi for that unless you’re going to criticise ethernet cables for not being where you want them. Notwithstanding this exclusion, I have found that being near the edge of a wireless network’s reach is great at messing with both the router and computers; where I am at present, if I go a few metres further away from the router, my Surface Book’s wifi connectivity stands a decent chance of kinda breaking after a while, so that it starts losing 5–30% of packets and getting average round-trip times to the router of 150ms—which together make it very close to completely unusable—until I next restart the computer (restarting the wifi adapter isn’t enough). I haven’t determined who’s at fault, the router, the adapter, or Windows 10.)


That was a great story about actually managing to debug a wireless problem. I wish there were better diagnostic tools commonly available that would clearly explain these things. I'm not sure if in the author's case there is some software that would listen for extensive polling and flag that as an issue - even better if it could log running programs at the time and try and guess which one is causing the issue.

I'm constantly having problems with my mac laptops and after many hours (10+) of internet searching I still have no idea why the wifi doesn't always work reliably. Some days or weeks (months?) it is great. Other days every hour it is disconnecting. Sometimes resetting the router helps, sometimes it doesn't. The whole situation is extremely frustrating.

Some issues: Why am I getting a DNS issue when my wired desktop never has a wireless issue?

Sometime I swear that the laptops becomes much less reliable at the opposite end of my house as my router but the signal strength is generally still excellent (4 bars).

Anyway, I'm not asking to have my particular issues solved (although that would be much appreciated!!!!). The real issue just seems to be that debugging these issues is extremely difficult and not based on principles but just random things that people can try (e.g., delete your plist files).


The other interference problem is that all devices talking to a particular access point are transmitting on the same frequency. As wireless is a shared transmission medium, they all have to take turns, and if two or more devices happen to try to transmit at the same time they'll corrupt each other's messages and have to go through random exponential backoff to find clear air, leading to random changes in latency and throughput.
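
To make the backoff part concrete, here is a toy Python sketch. It is only loosely modelled on 802.11-style contention windows (the slot and window numbers are illustrative, not a faithful Wi-Fi simulation): each collision doubles the window a station picks its random wait from, which is exactly why latency jumps around under contention.

    # Toy illustration of random exponential backoff: the contention window
    # doubles after each collision, so waits get longer and more variable.
    import random

    CW_MIN, CW_MAX = 15, 1023   # contention window bounds, in slot times

    def backoff_slots(attempt: int) -> int:
        # Window doubles with each failed attempt, capped at CW_MAX.
        cw = min(CW_MAX, (CW_MIN + 1) * (2 ** attempt) - 1)
        return random.randint(0, cw)

    random.seed(1)
    for collisions in range(5):
        waited = sum(backoff_slots(a) for a in range(collisions + 1))
        print(f"{collisions} collision(s): waited {waited} slots before sending")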


>all devices talking to a particular access point are all transmitting on the same frequency

Not necessarily. I just bought a Triband wifi6/ax router for like 150 quid.

5GHz band A -->> Main computer

5Ghz band B -->> iPhones & tablets & laptop

2.4ghz band -->> Everything else that is potentially noisy (ahem sketchy IoTs)


I exclusively use a wireless mouse. No latency, never have to charge it or change the batteries.

I like wireless but I hope there's a better interface than Bluetooth at some point.


I use a wired mouse on my desktop with 1000 polls/sec and latency of about 5 ms.

On my laptop (which is 9 years newer and has similar benchmark results) I use a wireless mouse which has 125 polls/sec and latency of 20 ms.

Yes, one can tell the difference. It is possible to buy low-latency wireless mice. Next time, I will.

Bluetooth is a trash fire. I was hoping wireless USB would take off, but that's dead now.


The difference is the screen refresh rate.

On 60Hz my wired and wireless mouse feel exactly the same.

On 240Hz, yeah, there is a lot of difference.
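
Rough numbers behind both observations, using the 20 ms wireless figure quoted upthread (illustrative arithmetic only; real end-to-end latency has more components than the poll interval and the frame time):

    # Back-of-the-envelope: latency added by the USB poll interval alone, and
    # how many display frames a fixed 20 ms mouse latency spans per refresh rate.
    MOUSE_LATENCY_MS = 20.0                  # wireless-mouse figure quoted upthread

    for poll_hz in (1000, 125):
        interval_ms = 1000 / poll_hz         # time between USB polls
        print(f"{poll_hz:>5} Hz polling: {interval_ms:.0f} ms interval, "
              f"~{interval_ms / 2:.1f} ms average added latency")

    for refresh_hz in (60, 240):
        frame_ms = 1000 / refresh_hz         # duration of one display frame
        print(f"{refresh_hz:>5} Hz display: {MOUSE_LATENCY_MS / frame_ms:.1f} "
              f"frames of lag from a {MOUSE_LATENCY_MS:.0f} ms mouse")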


Same here. My first one was an early Naga (around 2011) which could do wired and wireless. I loved having the battery in it, so it was a bit heavier. Unfortunately it wore out. The feet got a lot of wear and tear. Later Naga were only wired, and I missed it.

Now I have a Logitech G900 with a Logitech PowerPlay, so it's always charged. Obviously, it has fewer buttons/keybinds than a Naga.

On Macs, I like the Apple Magic Touchpad. It can do wired and wireless, and the battery works for ages.

Any of these would also work with a magnetic cable. You'd plug it in only when you had to charge it.


I've vastly improved the quality of my wifi by upgrading my routers and my devices.

The important thing to remember is that wifi tends to perform only as well as the worst device on your network. A dead-zone device hurts every device on your network. A G-only device hurts every device on your network.

So: 2 routers in my small house, hooked together with a wire. All my priority devices have new dual-band hardware and are on the 5GHz network. Old devices are relegated to the 2.4GHz network, which is quite spotty owing to that.

Unexpected things can do real damage too. I had an older PC that had its wifi dongle plugged into a USB 2 port and it would frequently get multi-second ping spikes. The same wifi dongle worked fine on a USB3 port. I ended up wiring that one in.

Personally I've abandoned all wifi hardware companies other than Asus - every other one has let me down at least once, if not repeatedly.

In every other respect, Asus seems to be incapable of making devices that rate higher than a B plus, but in the lowered expectations of the world of wifi hardware companies where "occasionally functional" seems to be considered good enough, Asus' spotty quality is a huge step up.


Bluetooth - yeah, fully agree. I'm having endless trouble just getting headsets to connect reliably. And my standing desk seems solid enough to physically block Bluetooth signals, which is an issue if the tower is below the desk and the mouse is above it.

Wifi, on the other hand, I find pretty stable, as long as you're on 5GHz. Haven't gotten the new ax/wifi6 to work yet though, despite buying the relevant tech.


On the topic of wired vs wireless... Has anyone had experience doing anything ridiculous such as getting a 40Gbit Ethernet adapter for say, their macbook? https://eshop.macsales.com/item/ATTO/TLNQ3402D00/


Kind of amazing that a macbook pro has four 40 Gb/s ports... and none of them are ethernet. Compare to 1 Gb/s ethernet which was rapidly adopted by Apple laptops as well as desktops. (Though you can currently get a Mac mini (!) with 10 Gb/s ethernet built in - if you can find something to connect it to.)

I've used thunderbolt cables to connect laptops together for fast data transfer. I think in order to get decent performance I had to disable the software ethernet bridge.


> ... with 10 Gb/s ethernet built in - if you can find something to connect it to.

Just as a data point, it's not expensive to add 10GbE (or higher) to a desktop or server/NAS if you're ok with getting 2nd hand gear from Ebay.

Gear from Mellanox (specifically) is pretty robust (eg rarely bad), and the ConnectX-3 and above series are dirt cheap. eg:

https://www.ebay.com/itm/MCX353A-FCBS-MELLANOX-CONNECTX-3-40...

Note, I don't know that seller at all. It was just one of the first items in the Ebay search results. :)


Those are Ethernet ports, just SFP+ ports rather than the RJ45 copper you're probably used to. I've never seen RJ45 ports > 10Gbps. Even 10Gbps is more often than not SFP+. Even if Cat8 can theoretically carry 40Gbps, it wouldn't really make sense to run copper at those speeds today IMO.

This device looks to be targeted at network engineers, who can use it for testing and benchmarking infrastructure, or at installations that will most likely be in or next to racks rather than on desktops.

You could just pop in a transceiver module that converts the port to RJ45. The price of those is peanuts compared to the 2k list price of this device, and the reverse isn't viable.


IIRC each side of the macbook pro only has a single Thunderbolt controller. So it's actually 2x 2 ports sharing 40Gb/s. If you need more than 40Gb/s, for say dual external gpus, you'd need to use cables on opposite sides.


>and none of them are ethernet.

Might have something to do with the fact that 8P8C is as thick as the entire laptop.


That can be easily worked around with the flat retractable sockets that PCMCIA Ethernet cards had in the past.


You mean this? https://twitter.com/Infoseepage/status/938837890494169088

Looks really bad for longevity.


Wow you would think that article was written in 2008!

I have had a very good experience with 802.11ac for all applications, and Bluetooth 5.0 (iphone 11 pro), and Bluetooth 4.2 with apple keyboard and mouse. Wireless headset to a macbook is pretty bad, pretty awesome to the iphone. I do not have dongles.

Knowledge of interference and the wireless devices you use is important.

I spend a non-negligible amount of time choosing a wireless router, and will immediately benchmark it. The routers from ISPs are still pretty bad compared to the ones you can buy for $70 - $120.

If I am using a desktop I will also benchmark the dongle.

You also have to account for proximity to power cords and transformers, which will degrade your experience.

It always requires a holistic solution to optimize and I don't know if the writer's problems can be fixed. Even two years ago I kept one or two devices on Ethernet if they were close to the router. But I have since removed them.


Any recommendations?


After going from a house to an apartment, when we moved back into a house - it was a new build - we made sure that every room was wired for gig-e. We didn’t want to take any chances on wireless. My wife uses an iPad with a 6 foot stand to teach online fitness classes. She even got an Ethernet adapter for the iPad.


My opinion on WiFi is that as long as the end user has a decent router/access point (AP), it's great. If the end user has a crappy $25 router, or is trying to cover a ridiculously large area with a single, very sub-optimally placed AP (think bottom of a closet), then yeah, they're gonna suffer. Almost no one needs a $200 prosumer router or a Ubiquiti setup, but everyone needs something better than a $25 router. My experience is that the sweet spot is around $70.

Bluetooth is definitely a disaster though. Apple has made great strides with their AirPods and the (It Just Works)™ philosophy behind their products, but even the AirPods aren't perfect. Once every month or so, my AirPods just refuse to connect for whatever reason and I have to restart the device outputting sound to get the connection working again.


Yes, latency and quality degradation suck.

But in most cases, for most people, wires suck even more.

Seriously. I can't walk around my apartment to tidy it up while on an audio call on my computer if I'm wired. When I walked around the city with wired headphones to my phone, the wires would constantly snag or something or other. Wires are constantly getting tangled, or I have to spend time carefully rolling and clipping them and unclipping them and unrolling.

Wires suuuck. They suck baaad.

If I'm doing something professional with audio or AV where quality/latency are paramount, then sure I'll use wires and dongles as necessary. They're a "pro" thing.

But for everyday use? Wireless is still generally good enough, and the pros waaay outweigh the cons. Which is why the world has been switching en masse.


I bought a pair of 190 Euro Sennheiser HD-25 II roughly 10 years ago. I've had to swap the cable once in those 10 years, which cost me 35 Euros (and the fact that after ten years you can still buy that cable speaks for itself).

In the same timeframe my girlfriend went through roughly 400 Euros of cheapass (or just cheaply manufactured but overpriced) headphones and earplugs, both wired and bluetooth. Some lost, many broken, etc.

And I used my headphones nearly every day: in the subway, when jogging, as a DJ on stage, for sound recording on film sets, standing outside in winter during rain. I stepped on them more often than I am comfortable with.

Wired headphones aren't bad — shitty wired headphones are.


I completely agree. Wireless works for me 99% of the time, and it's really, really convenient.

I was a Bluetooth headphone holdout, but getting a set completely changed my working life. With wired headphones I'd forget they were in and pull half the stuff off my desk when I got up. The wires frequently broke. The absolute convenience of a pair of good Bluetooth headphones is totally worth the latency. It's not even that bad on my plantronics over ear headphones - certainly usable for calls and TV. They've been an absolute life saver for video calls when there's background noise.


It's a tradeoff. Like everything else, it depends.

I'm not wiring up my house for ethernet any time soon; I found a 5GHz channel that performs well and can call it good.

But I've all but given up on wireless headphones. If I had a dollar for every time my airpods died or disconnected and refused to reconnect during a video call, I'd have enough dollars to buy new airpods now that their batteries are nearly shot. There's still a time and a place for them, but it mostly involves walking the dog.


It does frustrate me so much how wireless things can be finicky.

For example, my AirPods Pro worked flawlessly, except for about one month where they'd have a hard time switching between devices. But then it fixed itself.

Separately, AirDrop had always worked flawlessly between my Mac and iPhone and iPad, for years. Except two weeks ago it stopped working entirely between any of them. Tried rebooting everything, no help. Still won't work.

I've never gotten Sidecar to work, even though I meet all the requirements. On the other hand, my Bluetooth keyboard and mouse have always worked flawlessly every day I've ever used them.

It's all so random and luck of the draw... and you can never tell if the problem is because of Wi-Fi, or Bluetooth, or device software settings, or device hardware configuration, or settings on your Apple ID, or what.


Is it possible to use the perf tools (http://www.brendangregg.com/perf.html) and debug top-down to find the program that causes the issue, rather than guessing at programs and debugging bottom-up?


Ugh. I had the exact same issue with Qt and it took me forever to track it down[0].

Even though it's been fixed since 2017 I can confirm that there's still lots of network-using software out there that isn't using the most recent Qt library. Installing any of it will destroy your ability to have lag-free video chats or do any other real-time streaming.

I've always been in the "wires wherever possible" camp and it's things like this that are keeping me there. Hopefully headphone jacks don't go totally extinct anytime soon...

[0] https://cmetcalfe.ca/blog/diagnosing-periodic-high-ping-time...


I mostly agree, but would say wired is a better technology than wireless in terms of safety, reliability and cost. I only use wireless for the occasional phone call (usually on the road, in case of car trouble), for my keyboards, and for my Visonic home security system sensors.

Timothy Schoechle of the National Institute for Science, Law & Public Policy in Washington, DC recently wrote a paper called "Re-Inventing Wires: The Future of Landlines and Networks" on this subject. It's available as a PDF download from https://gettingsmarteraboutthesmartgrid.org/pdf/Wires.pdf


I still prefer a wired connection, but I don't think wireless is evil. The main problem is that people use crap equipment, most likely provided by the ISP. Most of this equipment has chips that overheat and degrade performance, and most ISPs only provide all-in-one router+modem combos. People use those instead of buying a good router and wireless access point.

This reminded me of an article I read a few years ago: https://arstechnica.com/gadgets/2015/10/review-ubiquiti-unif...


I wanted to point out that it isn't really wireless's fault. It is a problem of consumer WiFi and Bluetooth.

I can get a faster Speedtest transfer from 4G on my iPhone 11 than when standing next to my WiFi 6 router, and my 5GHz WiFi actually has more spectrum (though it's slightly noisier, as I live in an apartment). In terms of reliability, 4G works better than even the best "enterprise" WiFi network, and that enterprise WiFi gear (Ruckus, Aruba, etc.) is already zillions of times better than consumer-grade WiFi.

You then have ISPs wanting to sell you a wireless router combined with their ONT/modem. If they actually provided something better I would be happier, but they don't. The only good thing is that you only have to restart one device rather than two to test or "reset" your network. I recently learned that in Germany the law mandates that consumers have the right to buy their own ONT/modem, so you don't have to take the crap that most ISPs give you for cost reasons.

There is no reason why we can't have standalone 4G/5G in the unlicensed spectrum. Qualcomm actually made a case with MulteFire [1]; unfortunately, due to politics and vested interests, it may never actually take off.

WiFi 6 / 802.11ax was supposed to incorporate lots of learning from 4G/LTE/3GPP releases. But, as is typical of design by committee and vendor interests, we now have a WiFi 6 that doesn't support most of the originally promised features. That is why you're starting to see marketing push "WiFi 6+", which is different from WiFi 6E. If all this mess sounds familiar, yes: it was called USB.

That is one reason why I really like Apple's MFi programme and standards. Stop giving options and choices to vendors. Stop giving them margins of error in the name of cost reduction. If you are going to make 100 million units a year, stop trying to cut the cost by pennies; up the BOM by a dollar. Make something better. I will pay. Lots of people will pay. And a billion iPhone users will pay.

I want quality, not crap.

( Ok I have gone off topic )

[1] https://www.multefire.org


> Dongles. Even though all computers now have built-in Bluetooth, many Bluetooth accessories today still ship with proprietary dongles. I assume this is because the manufacturer was worried about inconsistencies or incompatibilities between their own Bluetooth implementation and your computer’s built-in Bluetooth hardware/drivers.

No, in most cases where you see dongles (keyboards, mice, gamepads) it’s because the dongle is not speaking Bluetooth to the device, but rather a “raw” pre-paired fixed-frequency RF wire protocol. Devices connected by such dongles (usually marketed as just being “wireless” rather than being “Bluetooth” devices) are basically electrically connected to your computer—just with an RF-modulated bridge stage for the electrical signal path. There’s no “wireless controller” or “modem” in these peripherals; they’re just letting the signal path flow out the antenna.

The disadvantage of these (besides the inconveniences of a dongle) is that these “raw” RF protocols provide no consideration for interference with one-another, besides maybe being e-fused to each operate on a different randomized sub-channel of the commercial-use 2.4GHz band. This means that you can’t have very many of these devices operating in the same “shared medium” (e.g. the same open office); and in fact, a channel collision for these devices won’t just interfere with one-another; they’ll often—lacking any device-ID header or per-device encryption key—just plain interoperate with one-another, with your “wireless” keyboard dongle picking up the typing of some coworker’s “wireless” keyboard! (They’re a lot like RF TV remotes in this regard.)

Note that devices that market themselves as Bluetooth but also come with a dongle are either 1. lying, and don’t actually use Bluetooth; or 2. have Bluetooth and “wireless” as separate modes. There are good reasons to offer both as separate modes:

• Compatibility. “Wireless” devices just look like USB devices, so you can use them to e.g. config your BIOS, or talk to any machine that can speak USB1.0, e.g. some old Win98 beige box. And plugging the dongle into a KVM is just like plugging a USB-connected device into a KVM; you can switch your keyboard’s “focus” between hosts using the KVM, without the host itself needing to re-pair with the peripheral. Switching Bluetooth peripherals around by having the Bluetooth controller on the KVM is a much more fraught process.

• Battery life. Bluetooth, at least before BTLE, burned energy to a far greater extent than the “wireless” protocols—mainly because the “wireless” protocols aren’t spending any energy on background activities like identity announcement for re-pairing, or frequency-hopping for better SNR. (This is why you see “wireless” peripherals that last months on AAAs, but all Bluetooth devices shipping with Lithium cells: the Bluetooth peripherals need charging frequently-enough that the number of AAAs they consume would be untenable.)

• Latency. No e-sport player would ever use a Bluetooth peripheral, since the Bluetooth input path often adds one or more in-game frames of latency (relative to the USB input path), before the input hits the game’s physics engine. “Wireless” peripherals have no such problem.


Thanks so much, this is a really useful correction!


> That means in dense areas (e.g. apartment buildings), routers will often ... interfering with each other.

Obviously the radios wouldn't interfere with each other if we could prevent the radio waves from leaking into other apartments. If only we had something that could be added to the router that would guide the radio waves directly to the wireless devices instead of indiscriminately radiating in every direction. Some sort of wave guide.

/lol


What's wrong with a wireless keyboard and mouse? Battery life is like one year, and I also don't have any issues with response times (not gaming, though).

But yeah, I don't see myself using wireless headphones that would not be perfectly in sync with video, and I am using wifi only because installing cable through the whole apartment would be too troublesome - and I'm not sure whether powerline networking would be faster or more reliable than wifi.


Personally, I don't understand the advantages. I've never had an issue with the cable that runs from the keyboard over the side of my desk. Maybe for the mouse, as it moves around (although I usually use a wired trackpad), but I'm not too convinced. Maybe if I tried using one for an extended period of time I would change my mind. In general I'm not too worried about reliability in the wireless sense, but I don't like the idea that these things can run out of battery. I like as many of my devices as possible to just work without any maintenance required (even if the maintenance is minimal).

Furthermore, I generally don't have batteries on hand. (Right now the complete list of things I own that use batteries is the TV remote and the thermostat.) So when I get the low-battery warning I need to go and buy some.

On the other hand, I completely get the case for Bluetooth headphones. I love being able to run to the kitchen without losing my music or still being able to hear someone else speaking. Also, when using them with a phone, the cable running down to my pocket does actually get in the way. Synchronization shouldn't be an issue, because any not-terrible Bluetooth speakers will report their delay to the computer so that everything remains synchronized. I do have one set of cheap earbuds that don't report it to my phone correctly, but this is rare (and I probably should have sent them back).


The main advantage for me is that I can pack it away when I need more desk space. (eg.: planning something on paper)


When we first opened our office we had a small kitchen. Interestingly, when we turned the microwave on, wifi access in the whole office went down for as long as it was running. Turns out 2.4GHz WiFi can be interfered with by some microwave ovens that may have poor shielding... Also, the signal meter on the computer showed the signal as strong the whole time, but packets dropped like crazy.


Regardless of the product advertising, the Jabra 75 does NOT support a native Bluetooth connection to computers (although Jabra is not quick to admit it): https://medium.com/@daniel_36042/jabra-we-know-our-bluetooth...


This depends a lot on your particular situation and equipment. I ran the ping test exactly as shown, with wi-fi and wired, and there is negligible difference in average latency or standard deviation.

Even when using ethernet, the PC connects through an Orbi mesh router, two hops away from the base station. A round trip to local IP addresses is between 9-10ms. Switching to wi-fi adds 2-3ms to that.


They mention that the polling increases latency. But by how much?

Is windows affected by this? Because I've never noticed it have an impact on video calls.


It mostly depends on the actual network hardware, drivers and the access point, not the OS.

When a WiFi device does a 'scan' (which it needs to do to know if there might be a better AP to connect to, amongst other things), it has to retune the radio to another frequency. That means it can't listen on the original frequency, and packets get lost. Depending on the network design, they might be retransmitted from the access point (layer 2 retransmit, a few milliseconds), or by TCP after a timeout (1 network round trip, hundreds of milliseconds often).

A third option is your laptop tells the AP that it is going into a power saving mode and won't be listening. Packets are then queued for it while it retunes and does a scan.

Best would be to have multiple radios, one whose only job is scanning. Then there would be no latency impact at all.
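
If you want to watch this happen, here's a rough Python sketch. It assumes Linux, a wireless interface called wlan0, the `iw` tool installed, root privileges for the scan, and a router at 192.168.1.1 - all placeholders for your own setup.

    # Rough sketch: ping the gateway while forcing a scan partway through, so
    # any scan-induced latency spike shows up in the per-packet times.
    import re
    import subprocess
    import threading
    import time

    GATEWAY = "192.168.1.1"   # placeholder; use your router's address
    IFACE = "wlan0"           # placeholder; use your wireless interface

    def trigger_scan_later(delay_s: float) -> None:
        time.sleep(delay_s)
        # `iw dev <iface> scan` forces the radio to retune across channels.
        subprocess.run(["iw", "dev", IFACE, "scan"],
                       capture_output=True, check=False)

    threading.Thread(target=trigger_scan_later, args=(10.0,), daemon=True).start()

    out = subprocess.run(["ping", "-c", "20", GATEWAY],
                         capture_output=True, text=True, check=False).stdout
    for m in re.finditer(r"icmp_seq=(\d+).*time=([\d.]+) ms", out):
        print(f"seq {m.group(1):>2}: {float(m.group(2)):8.1f} ms")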


One of the worst parts of being a network worker at a university is managing the wireless and the expectations of how wireless should perform at every square inch of the campus. I understand the frustrations of the students, but at the same time: learn a bit about the pros and cons of the technology you're so dependent on.


I ran ethernet cable in hopes of eliminating video call dropouts, but I still have plenty of issues.


You may have local issues. I ran a series of iperf3 tests on a small but non-trivial network and was surprised. There was a 100Mbps switch that had been forgotten about. There was also a wifi network running off a Power over Ethernet link that managed 3Mbps, compared with 50Mbps over the fairly poor wifi it was supposed to be superior to - which makes even that wifi look pretty darn good next to the PoE link. A cheap switch and two LAN cables later, things are so much better.


100 Mbps is enough for 15-20 HD video calls using modern codecs. Gigabit is great for backups, but for ordinary use 100 Mbit is fine.

If you're getting lag and dropouts with a wired connection, the problem is most likely bufferbloat.
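
The arithmetic behind the 15-20 figure above, roughly (the per-call bitrates are ballpark assumptions on my part, not measurements):

    # Rough capacity math: how many HD calls fit on a 100 Mbps link at a few
    # assumed per-call bitrates (real calls vary with codec and resolution).
    link_mbps = 100
    for per_call_mbps in (5.0, 6.5):
        print(f"at ~{per_call_mbps} Mbps per call: "
              f"about {int(link_mbps // per_call_mbps)} concurrent calls")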


Maybe it's due to the other side of the video call, using their wifi?


To Windows 10 users with a bad wifi experience: Windows power management has become more aggressive. It caused continuously dropped VPN connections until I set power management to High performance for all power profiles. It took me almost a year to figure out the cause.


Wireless transmission is pretty interesting. When working through the OSI model and getting down to the physical layer for WiFi, I was surprised by how transmission works and how often you get conflicts when broadcasting packets in the same frequency range.


The issue with WiFi is overcrowding. Suggestions to fix this are to get everyone to lower the power on their wifi transmitters (hard to control), and to make sure you're on a non-crowded channel (easier to control).


I've wanted an alternative to 1-Wire, since I really do not need very quick updates, but I do want really long and questionable wiring! This looks like a perfect fit!


I agree, wireless Internet is pretty terrible.

However, I find that I prefer everything else wireless.

I wish there were more solutions for mechanical wireless keyboards, in fact.


Agree. I have a lot of cables around my desk.

It’s not just video calls. My backup is done to a Synology NAS, upstairs. That’s all wired.


Switched back to wired on Friday.

My VPN was acting up, and my video calls as well.


I don't like wireless either, for many reasons (the reasons they list there are only some of them; there are more).


Yah, the main thing he doesn't mention--and in most households, it's the primary thing--is retry hogging by devices that have a poor connection to the access point.

High end access points use a time slicing algorithm (tdma-ish) and 802.11ax (Wi-Fi 6) mandates something along those lines. So Wi-Fi will eventually improve a little.

But for all the household all-in-one modem-router-switch-accesspoint boxes, one device with a poor connection can hog most of the access point's time, doing re-transmits of corrupted data.

If you're video calling over wi-fi, turn off every other device that uses the wi-fi.


>High end access points use a time slicing algorithm (tdma-ish) and 802.11ax (Wi-Fi 6) mandates something along those lines.

You will be surprised to learn that not every WiFi 6 router ships with OFDMA support turned on.


Forgot to mention:

If you still have problems after switching to wired, it's almost certainly bufferbloat. (Avery Pennarun has a nice writeup about this, on getting usable video calling over his parents' low-end ADSL connection.)


And try getting a home builder to install Ethernet. They just look at you with a blank stare.


In France, ethernet with RJ45 is the standard that replaced the older phone lines we used to run in the walls. Since you can plug RJ11 into RJ45, the same sockets can serve the phones too.

Any decent electrician in France knows how to install ethernet, and they can install this kind of equipment (the equivalent of a distribution cabinet) instead of the regular "phone line" box:

https://www.tableau-de-communication.fr/coffret-hager/13-cof...

https://www.tableau-de-communication.fr/coffret-hager/14-cof...


When I had work done in my house I just made sure that they knew that before they boarded up a particular ceiling I needed to run a cable. In the end we all ran the cable together, and now I have a 1Gb/s Ethernet connection to my office.


My house, built 16 years ago, has two 4-pair CAT-5e cables run to every room for phone wiring, standard from the home builder.

The blank stare came when I asked for conduit and pull-string everywhere; they weren't willing to do that, or to do any variant beyond the standard 2 cables to each location. So my office has 4 ethernet ports (adding another "2-cable jack" was something they could do) and needs its own switching.


Depends where you live. Here in Canada almost all new builds have cat6 run to each room to a central closet.

Problem now is that most electricians think that because "it is just low voltage wiring" you do not need to worry about interference and can just run it anywhere.


maybe try a different builder then.

those are cables and sockets, standardized to the point of being a commodity. no magic involved at all.


You shouldn't run it parallel to electrical lines, because they will induce interference. Other than that it's pretty straightforward.


Url changed from https://www.lesswrong.com/posts/8hxvfZiqH24oqyr6y/wireless-i... to the canonical source.


You're just using the wrong kind of wireless. I only use the best: copper.


Not fibre?


I avoid wireless because of EMF sensitivity that my wife has. Just to be sure, I've secretly switched on the wifi some days to see if she mentions any symptoms, and she definitely does. I personally haven't felt a difference, but it is fun running a wired network - switches all throughout the house - and it gave me the motivation to set up Pi-hole and a print server.


Here we go again: https://en.wikipedia.org/wiki/Electromagnetic_hypersensitivi... "The majority of provocation trials to date have found that such claimants are unable to distinguish between exposure and non-exposure to electromagnetic fields. A systematic review of medical research in 2011 found no convincing scientific evidence for symptoms being caused by electromagnetic fields. Since then, several double-blind experiments have shown that people who report electromagnetic hypersensitivity are unable to detect the presence of electromagnetic fields and are as likely to report ill health following a sham exposure as they are following exposure to genuine electromagnetic fields, suggesting the cause in these cases to be the nocebo effect." Wikipedia article has links to scientific studies.


One point I am wondering about is whether the symptoms could be caused by an actual disease (unrelated to EM) that needs treatment, but goes untreated because the affected person is convinced that the cause is EM exposure.

However, I guess people usually turn to unproven treatments once medicine failed several times to treat their symptoms, so they likely already went to a regular doctor several times beforehand, ruling out any easily discoverable disease.


Eh, I don't really care. I am not harming anyone by offering this for her. If she feels better and I have fun then it's a good time.


> I am not harming anyone by offering this for her. If she feels better and I have fun then it's a good time.

Some would say you're harming your wife's understanding of reality, but that's a private matter for you and her.

However, by sharing it on the Internet, you are potentially spreading this bullshit, and you must expect reasonable voices to counter your claim.


Reasonable voices would not be so quick to condemn. Please try to understand that we are just beginning to discover the non-thermal effects of electromagnetic fields, and the science is not looking too good for the wireless industry, which will fight tooth and nail to discredit any findings that would cut into their billions.

Wi-Fi is an important threat to human health: https://www.researchgate.net/publication/323998588_Wi-Fi_is_...

Electromagnetic Fields Act Similarly in Plants as in Animals: Probable Activation of Calcium Channels via Their Voltage Sensor: https://www.researchgate.net/publication/305691437_Electroma...

How to Approach the Challenge of Minimizing Non-Thermal Health Effects of Microwave Radiation from Electrical Devices: https://www.researchgate.net/publication/283017154_How_to_Ap...


> https://www.researchgate.net/publication/323998588_Wi-Fi_is_...
>
> Wi-Fi is an important threat to human health

Pall's methodology and understanding of EM have been criticized broadly and repeatedly. Here's an "official" rebuttal to the paper you cite: https://www.researchgate.net/publication/328436881_Response_...

The rest of the papers you cite are by the same highly controversial biochemist.

At any rate, the discussion at hand is whether "EMF sensitivity" is a real condition. It patently is not. Anyone with such a condition would easily be able to collect the large monetary rewards that have been made available, and at the same time immediately lift the social stigma they must feel. The fact that they do not speaks volumes.


Yeah, cherry picking, me. Note that the rebuttal does not refute Pall's biomechanical theories, or say that there are no non-thermal effects, but that we need more thorough research, and that this research is not being funded. The precautionary principle (ignored in the USA, it seems) would indicate that we should be cautious and not quick to dismiss.


Thanks for sharing the research! You might like the YouTube channel Scottiestech.info. Do you have any other way of connecting? Your profile 'about' is very interesting - it'd be great to talk more about hacking and planting trees. You can reach me at the email in my profile.


Hello triyambakam. I'm still setting up my email (moving from gmail) but you can reach me at my username @ protonmail.com.


> definitely she does.

That's not scientific.

That's not a double-blind experiment. You may be more attentive to her mentions when you've secretly switched on the wifi. You may be behaving differently and she may be picking up clues from your behavior. Etc.

If you really want to be scientific, you need to do at least n=100 experiments to detect a 10% bias at 95% confidence level. (Z=2 E=0.1) [0]

If you detect such bias, your wife may be able to earn $1M prize for her supernatural abilities. [1]

[0] https://en.wikipedia.org/wiki/Checking_whether_a_coin_is_fai...

[1] https://en.wikipedia.org/wiki/One_Million_Dollar_Paranormal_...
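
For what it's worth, the n=100 figure is just the standard worst-case sample-size bound for estimating a proportion (the formula from the coin-fairness page linked above), and it checks out:

    # n >= Z^2 / (4 * E^2): worst-case sample size to estimate a proportion
    # within margin E at confidence Z (Z ~ 2 for roughly 95%).
    Z, E = 2.0, 0.1
    print(Z**2 / (4 * E**2))   # -> 100.0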


> If you detect such bias, your wife may be able to earn $1M prize for her supernatural abilities. [1]

This snark seems unnecessary, given that magnetoception by bats and birds is unlikely to win said prize.


That's no snark.

If she can reliably demonstrate sensitivity to -60dBm RF signals (nanoWatts - the typical received signal strength of a Wi-Fi signal), that's the equivalent of having discovered a new human sense on top of the 5 known senses.

This discovery, or the explanation of mechanism underlying this sense, is worthy of a Nobel prize.
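
(For reference, the dBm-to-watts conversion behind the nanowatt figure - the standard formula, nothing specific to this thread:)

    # dBm is decibels relative to 1 mW: P_watts = 10 ** (dBm / 10) / 1000.
    def dbm_to_watts(dbm: float) -> float:
        return 10 ** (dbm / 10) / 1000.0

    print(dbm_to_watts(-60))   # -> 1e-09 W, i.e. about one nanowatt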


Sometimes when a persistent noise goes away I will realize that I've been slightly tense the whole time, and all of a sudden I feel relief. Depending on what you mean by "switched on the wifi", could it be that there is some subtle piezoelectric noise that goes away when you turn off a device completely?


Indeed. I have sensitivity to transformer hum.


[flagged]


Great, thanks for letting me know!


Triyambakam, please don't take these criticisms too much to heart. Smokers parroted the tobacco industry's claims that smoking didn't cause cancer in their own ignorance. There may be a similar thing happening now with the wireless industry. We are just scientifically beginning to discover the non-thermal effects of EMF radiation. If these fields do cause detectable changes to the physiology of human cells, there is reason to believe that your wife could be sensitive to these changes.

Wi-Fi is an important threat to human health: https://www.researchgate.net/publication/323998588_Wi-Fi_is_...

Electromagnetic Fields Act Similarly in Plants as in Animals: Probable Activation of Calcium Channels via Their Voltage Sensor: https://www.researchgate.net/publication/305691437_Electroma...

How to Approach the Challenge of Minimizing Non-Thermal Health Effects of Microwave Radiation from Electrical Devices: https://www.researchgate.net/publication/283017154_How_to_Ap...



