I’ve got a fair few and can’t think what I’d want 5GHz for, but every time I ask why people want a particular feature I’m impressed with what people are doing.
The FTC Robotics competition mostly uses two Android phones per robot: one for the robot and one for the controller. They use WiFi Direct to link up.
For a while (and in some cases still) the phones only supported 2.4GHz.
Congestion on the 2.4GHz band can cause disconnects during competition play, so the general rule is no 2.4GHz networks at the venue. No hotspots, no access points, nothing.
Now in my use case I needed to deploy some cheap wireless devices to do things like queuing, automation, and camera tally lights, but couldn't because of that restriction, since many cheap devices (Raspberry Pi, ESP32) only supported 2.4GHz. I ended up using the RTL8720DN, which is dual-band, or devices that contain it such as the Seeed Wio Terminal.
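For reference, getting the Wio Terminal onto a 5GHz network is just the standard Arduino WiFi dance; Seeed's rpcWiFi library wraps the RTL8720DN behind the usual API, and the radio simply associates on whatever band the SSID lives on. A minimal sketch, assuming that library (the SSID and password here are placeholders):

    #include <rpcWiFi.h>  // Seeed's WiFi library for the Wio Terminal's RTL8720DN

    const char* ssid     = "tally-5g";   // placeholder: a 5GHz-only SSID
    const char* password = "changeme";   // placeholder

    void setup() {
      Serial.begin(115200);
      WiFi.mode(WIFI_STA);
      WiFi.begin(ssid, password);
      while (WiFi.status() != WL_CONNECTED) {  // retry until associated
        delay(500);
        Serial.print(".");
      }
      Serial.print("\nConnected, IP: ");
      Serial.println(WiFi.localIP());
    }

    void loop() {
      // tally light / queuing logic goes here
    }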
OTG is used to connect the USB game controller(s) to the Android phone on the "driver" side. On the field, a second Android phone attached to the robot receives the input from the driver-side phone.
I think it's for environments where 2.4 is just trash and you'd rather not run any of it at all, and your IoT crap is the only reason you're forced to still stand up a 2.4 AP in the first place.
This lets you go 5-only, and that's big for some settings.
I noticed last night how much traffic was on my 2.4GHz IoT network, because my phone had connected to it. TBH I have no problem with everything being on 2.4; I just wish it were common to have two 2.4 radios so you could run a true IoT-only network.
I live in an apartment. Not even a super dense tower or anything, just a townhome-style complex where everyone has their own garage and front door. From where I sit right now my phone can see nine networks on 2.4GHz channel 1 alone. Channels 6 and 11 are about the same.
I want everything I can get running on the 5GHz band and, in the future, the 6GHz band. I've been holding off on new APs to upgrade from 802.11ac until I can get ones that support 6GHz.
Just because there are loads of networks doesn't mean there is loads of traffic. I see over 20 networks from my laptop's WiFi, and yet all my smart home gear runs without a hitch.
What measurable benefit do you expect from moving to 5GHz?
> Just because there are loads of networks doesn't mean there is loads of traffic.
Networks simply existing causes traffic: the beacon frames that advertise an SSID are always transmitted at the lowest rate the network supports, which on 2.4GHz consumer gear almost always means 1 Mbit/s 802.11b. (And no, disabling SSID broadcast doesn't avoid this; a hidden network still transmits beacons on schedule, just with the SSID field blanked.)
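To put rough numbers on that, assuming a ~300-byte beacon (typical once you count all the information elements) and the default 102.4 ms beacon interval:

    300 bytes ≈ 2400 bits → ~2.4 ms of airtime per beacon at 1 Mbit/s
    2.4 ms / 102.4 ms ≈ 2.3% of the channel per network, doing nothing but beaconing
    nine co-channel networks ≈ 21% baseline utilization before anyone moves real data

Those are back-of-envelope numbers, but they line up suspiciously well with the utilization floor I describe below.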
Also keep in mind that 2.4GHz interference isn't just WiFi; it's also Bluetooth and decades' worth of cordless phones, RF remotes, gamepads, keyboards, mice, etc.
Anyways, I use Ubiquiti gear at home, and one of the nice features it has is logging channel activity. On 2.4GHz channel 11, utilization never drops below 25% and regularly hits 75% during peak home WiFi hours. I have three smart lightbulbs and a bed on 2.4GHz at the moment; those devices have logged literally double-digit megabytes of activity over months, so the traffic on the band isn't me.
On 5GHz channel 161, on the other hand, I have three laptops, two tablets, two phones, and two TVs used almost exclusively for streaming, and with all that my average channel utilization stays below 10%, with the spikes above that base level almost entirely correlated with my own devices' activity.
---
I don't know what the actual real world difference is, but it's not exactly hard to make the case that moving devices to bands with shorter range and more available spectrum is a good thing for reducing the inadvertent interference caused by modern consumer tech.