ESP32-C5: Espressif’s First Dual-Band Wi-Fi 6 MCU (espressif.com)
129 points by extesy on June 22, 2022 | 91 comments



Yay for 5 GHz and RISC-V. The only other thing I want is BLE with actual BLE-level power consumption from Espressif.

Speaking of which, Nordic bought out that WiFi startup in late 2020 but has been silent about their WiFi MCU since:

https://youtu.be/PLdpg-YXhv0


It's still a single-core CPU, though, whereas the Xtensa-based ESP32 has a dual-core CPU. Not sure people are actually using that second core in practice. And yes, this is probably an excellent case where RISC-V can win; it wouldn't surprise me if RISC-V already has better toolchain support than Xtensa despite being a much younger ISA.

Some benchmarks on the earlier ESP32-C3 at https://hackaday.com/2021/02/08/hands-on-the-risc-v-esp32-c3... ; it seems to have the same core as the new ESP32-C5, except clocked at 160 MHz instead of 240 MHz. Extrapolating those ESP32-C3 benchmark results out to 240 MHz (a 1.5x scaling) suggests it's a bit spiffier than a single-core Xtensa ESP32, but not enough to catch the dual-core Xtensa.


It's extremely common to use both cores on the dual core Xtensa because you can bind ESP tasks (WiFi/Bluetooth) to one core and schedule your application code on the second core, reducing jitter visible to the application code.

The cores were even named Core 0 = PRO (protocol) and Core 1 = APP (application) in the initial ESP-IDF design, where the core-affinity model was going to be more strongly enforced (thankfully they backed off on this idea and it is now fully configurable SMP).
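
A minimal ESP-IDF sketch of the pattern (untested; task name, stack size, and priority are placeholders):

    // Pin a latency-sensitive task to core 1 (APP) while the WiFi/BT
    // stacks keep core 0 (PRO) busy. Assumes a standard ESP-IDF project.
    #include "freertos/FreeRTOS.h"
    #include "freertos/task.h"

    static void app_worker(void *arg)
    {
        for (;;) {
            // ... latency-sensitive application work ...
            vTaskDelay(pdMS_TO_TICKS(10));
        }
    }

    void app_main(void)
    {
        xTaskCreatePinnedToCore(app_worker, "app_worker",
                                4096,  // stack depth (bytes in ESP-IDF)
                                NULL,  // task argument
                                5,     // priority
                                NULL,  // no handle kept
                                1);    // core ID: 1 = APP core
    }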


I make displays out of discrete lights and panels. When I use dual-core chips, I usually use one core for driving pixels and scanning continuously while the other core does network stuff and actual framebuffer content updates. Makes for a very glitch-free and seamless experience :)


Once RVV implementations start working their way down into MCUs, there will be little to no reason for Xtensa; that's probably a 3-5 year time frame. The ESP32 does a bunch of SDR on the Xtensa VLIW/DSP core, afaik. You would have to measure CPU-cycle jitter under adverse WiFi conditions to see if the single core is an issue.

I would assume that the WiFi stack is hardware-scheduled on that core with dedicated scratch RAM, and that the rest of the cache and CPU cycles are granted to the user-visible core only while not needed. I think a whole lot of applications didn't use the second Xtensa core on the earlier parts.


I think developers using the Arduino platform on the ESP32 are mostly just using one core. Developers using Espressif's SDK are more likely to use both cores.


RISC-V is a far better-supported architecture than Xtensa for the single fact that it has upstream support in LLVM. After years, Espressif still hasn't managed to get its LLVM backend upstreamed, which means using Rust or anything else that depends on LLVM is a hassle.


They've indicated that there are multi-core RISC-Vs in their future...


>Yay for ... and Risc-v

Why? You as a user get half the computing power, and you celebrate it because the vendor saved on licensing?


I'd like a version with ~1 MB of on-chip SRAM. 400 kB can be tight, because the WiFi stack etc. consumes a sizable portion of it.

And yes, I know about SPI PSRAM, but it has quite a few downsides.


It seems the main thing eating RAM is TLS. It's kind of funny that we have protocols like MQTT that are designed to be simple with MCUs in mind, but then run them over TLS, which adds way more overhead than, say, plain HTTP.

It would be interesting to have a secure communication channel that is simpler than TLS. OTOH I guess we can just wait and the RAM will get cheaper.


Encryption kind of sucks for these tiny IoT gadgets. I'd much rather have everything completely open and local only, and just not give out access to my network to people I don't trust.

It's always possible that another device hijacks its firmware update and leaves a payload that then hacks a NAS or something, but that's a fair tradeoff for having devices that don't phone home to some server that could vanish at any time, considering it's pretty unlikely unless you buy really bad no-name stuff.

Plus, it can be very easily prevented by just not having a firmware update path that can be reached without a physical connection or button press.

A lot of this stuff doesn't need updates regularly. It's just a light bulb. It doesn't have to be secure against people on the same network.

The cloud stuff is the stuff that gets obsolete or gets hacked, so it should go through a separate hub.


Well, sometimes you do want to control something directly over the internet, especially in industrial applications; imagine you are monitoring a vehicle's location or the temperature/humidity of a fridge in a supermarket.

From my limited understanding of encryption/TLS and from playing with microcontrollers: these devices can't hold the full list of root certificates that we have access to in our browsers / on Linux. So you can't just go out and use any SSL/TLS certificate; you have to flash into the device either the signature of the cert you will use, or the root of trust you expect to use for the next X years.

Just look at Azure IoT Hub documentation right now:

> During a TLS handshake, IoT Hub presents RSA-keyed server certificates to connecting clients. Its root is the Baltimore CyberTrust Root CA. Because the Baltimore root is at end-of-life, we'll be migrating to a new root called DigiCert Global G2. This change will impact all devices currently connecting to IoT Hub. To prepare for this migration and for all other details

It seems to me that if you have these problems anyway, you might as well use authenticated symmetric encryption: it is perfectly feasible to give each device a unique key, and you could use that until the end of days with much lower hardware requirements. LoRaWAN uses symmetric encryption only.

Also, in the case of fridge temperature monitoring, I don't even care if someone else can snoop on the data. I only care that the device is sending data to the real server and not to an imposter, so something like HMAC should work for that.
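
With the mbedTLS that ships in ESP-IDF, that's only a few lines. A sketch (DEVICE_KEY is a placeholder for a provisioned per-device secret, and you'd want a counter or nonce in the payload to stop replays):

    #include <string.h>
    #include "mbedtls/md.h"

    static const unsigned char DEVICE_KEY[32] = {0}; // per-device secret, provisioned at manufacture

    // Compute HMAC-SHA256 over the reading; returns 0 on success.
    // The server, knowing the same key, recomputes and compares.
    int sign_reading(const char *payload, unsigned char mac_out[32])
    {
        const mbedtls_md_info_t *md = mbedtls_md_info_from_type(MBEDTLS_MD_SHA256);
        return mbedtls_md_hmac(md, DEVICE_KEY, sizeof(DEVICE_KEY),
                               (const unsigned char *)payload, strlen(payload),
                               mac_out);
    }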


There are solutions to such problems, for example by using custom CAs + cert pinning or pinning cross-signed intermediates; if the Baltimore CA signed an intermediate together with the DigiCert CA then that intermediate can be pinned and used for as long as the Baltimore CA is considered valid by the device. Implementing DANE might also be an option, negating the need for CAs entirely at the cost of extra network traffic (and angering some very dedicated internet people who detest DANE for some reason despite its usefulness in such scenarios).

The RNG problem can be solved relatively inexpensively with dedicated hardware for good-enough RNGs (all you really need is a floating voltage to seed a PRNG, and it should work well _enough_ in practice). Because of the limited data throughput of many IoT solutions, it's quite difficult to capture enough traffic to reverse the state of the RNG, compared to desktop applications where you can capture many gigabytes of data per second.

Symmetric keys also work well if you can figure out secrets management. Managing many secrets can quickly become a pain, though, so it really depends on what kind of devices you use and how many.

But yes, I think most IoT data is pretty worthless for attackers, so it can probably be sent without encryption. If you hack into my network to monitor the temperature of my room then I don't know what to tell you, there are probably better targets out there.


Since the ESP32 chips support Wi-Fi, I assume they must have a RNG for Wi-Fi encryption to work properly, right? If so, is the RNG accessible to the application?


Yes and yes; the SDK has esp_random() and esp_fill_random(), which hook into the hardware RNG.
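
Usage is about as simple as it gets (header name varies by IDF version; it's esp_system.h on older ones):

    #include <stdint.h>
    #include "esp_random.h"

    void example(void)
    {
        uint32_t r = esp_random();          // one 32-bit value from the HW RNG
        unsigned char buf[32];
        esp_fill_random(buf, sizeof(buf));  // fill an arbitrary buffer
        (void)r;
        // Note: per the docs, the output is only truly random while the
        // RF subsystem (WiFi/BT) is running.
    }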


I don't see why a fridge should talk directly to a server. That would be a lot of fridges to reflash (or, more likely, sensors to replace) if the server changes.

If they all go to a hub, all you have to do is update the hub, plus, the hub can do stuff like local alerts.

Symmetric encryption is probably the way to go for use cases that directly connect.

I wish there were something like an IPv9 (7 and 8 are probably taken!) with 48-byte addresses and a fixed layout: 8 bytes ISP identifier, 4 bytes customer, 2 bytes subnet, 2 bytes device, and 32 bytes for a public-key hash.

No more certificates, no more insecure LAN connections, just IP that's always secure no matter what, as long as you have the right address; and even if you change ISPs you keep the key part of the address.


Most IoT stuff has no random number generator anyway. Without a random number generator you can't do TLS properly.

Without that, you might as well just use HTTP and stop pretending it's secure.


Can't you just gather some entropy from your environment? That would prevent most classes of attacks you care about.


There are a couple of ways to do that.

You can feed the low bits from an ADC into a hash algorithm. You can feed in RSSI readings from the radio as well. And finally, newer embedded processors and some radio transceivers have built-in random number generators. It helps to flash each device with a unique random seed too.
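
Something like this with the legacy ADC driver, for instance (sketch; whether the low bits carry real entropy depends on the board, so treat it as a supplement to a hardware RNG, not a replacement):

    #include "driver/adc.h"
    #include "mbedtls/md.h"

    // Collect noisy ADC LSBs from a floating pin and condition them
    // through SHA-256 so biased bits don't leak into the seed directly.
    void gather_seed(unsigned char seed_out[32])
    {
        unsigned char raw[256];
        adc1_config_width(ADC_WIDTH_BIT_12);
        for (int i = 0; i < (int)sizeof(raw); i++) {
            raw[i] = (unsigned char)(adc1_get_raw(ADC1_CHANNEL_0) & 0x0F);
        }
        mbedtls_md(mbedtls_md_info_from_type(MBEDTLS_MD_SHA256),
                   raw, sizeof(raw), seed_out);
    }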


Most little embedded machines don't have much environment to get entropy from. If you boot up and download your config from an HTTPS server, there is a good chance the whole machine state (i.e. every byte of RAM) is identical to the last time you did that.


I consider RNGs basically solved. It's possible to do them wrong, and hardware backdoors could happen, but it's not like we don't have plenty of entropy sources on almost all platforms.


I've been working on little temp/humidity/air-quality sensors on ESP32, and I've been feeling the MQTT+TLS pain as well. I was thinking I might create a separate WiFi network just for these devices, with a rule on the router that only allows MQTT from the sensor network to the main network, and SSH from the main network to the sensor network. Then I can just get rid of TLS. Sure, the MQTT will flow in cleartext from the AP, through the switch, and to the MQTT broker machine, but I think that's fine.


What are your issues with MQTT+TLS in practice? If you can keep the TCP connection open then you shouldn't have too much overhead, should you?

You may want to look into the Matter standard, which already takes security into account and provides some non-certificate-based security options. Such devices should integrate well with current and upcoming home-automation products from tech giants like Google and Apple. Implementing it from scratch seems very, very complicated, but there appear to be premade examples available for ESP32 and similar devices. As an added bonus, Matter can work over other physical protocols such as Bluetooth LE and 802.15.4 + Thread, which may take some of the load off your WiFi.


If you have control over the software, I think you can shave off significant size by throwing out everything but the most recent TLS 1.3 protocol, using certificate pinning so you don't need the full CA cert list, and removing things like session resumption, client-side certificate authentication, you name it.

The MQTT overhead with the right security layer on top (i.e. 0-RTT TLS with extra checking on the backend to prevent replay attacks) should be quite minimal. There's a computational overhead for calculating the keys of course but the algorithm itself shouldn't need too many extra bytes. Similarly, CoAP + DTLS also lends itself to quite little network overhead.
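
For the pinning part, a rough esp_tls sketch (untested; the PEM symbols are the usual EMBED_TXTFILES placeholders, the broker URL is made up, and the connect call was renamed in later IDF versions):

    #include "esp_tls.h"

    // Trust exactly one pinned root (or the server cert itself)
    // instead of carrying a full CA bundle.
    extern const unsigned char server_root_pem_start[] asm("_binary_server_root_pem_start");
    extern const unsigned char server_root_pem_end[]   asm("_binary_server_root_pem_end");

    esp_tls_t *connect_pinned(void)
    {
        esp_tls_cfg_t cfg = {
            .cacert_buf   = server_root_pem_start,
            .cacert_bytes = server_root_pem_end - server_root_pem_start,
        };
        // IDF 4.x API; newer IDFs use esp_tls_conn_http_new_sync().
        return esp_tls_conn_http_new("https://broker.example.com", &cfg);
    }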


Does TLS require lots of RAM? Surely keeping a 16-byte key in RAM is all that's needed once the connection is established?

I imagine establishing the connection requires more RAM, but that's an event that can just use stack space, freeing it all up again once the connection is established (assuming nothing async).


No ROM on these chips, so RAM is both code and data.


How much of that code has to be resident in SRAM or XIP cache once a session key has been established, to handle the common TLS record types? Is it feasible to execute the whole TLS session-setup code in place from serial flash, or at least copy it to SRAM on demand, allowing the application to use more SRAM once a session has been established?


That's why I'm interested in how WireGuard would do on these devices; there's a library for that, but I haven't experimented with it yet.


Totally agree. 400 kB is just not enough with WiFi and BLE and more than ~6 tasks.

And SPI RAM cannot be used for task stacks, so it's incredibly limiting for your program architecture.


That has a limit of 8 MB as well?


5GHz support on an ESP32 is something everyone has been waiting for, for a loooong time. So, that would be good, I guess?

But in general, 802.11ax is orthogonal to 'non-stop battery-based connectivity', so we'll have to wait and see how well that works out.

(Assuming they ship this one, and assuming anyone will be able to buy one in the next 2 years or so.)


Target Wake Time (TWT) in 802.11ax could actually substantially increase battery life for devices that need to remain connected and can't use modem sleep or deep sleep.


Yes, could, assuming that the AP supports it -- and nobody seems to be in a big hurry to implement that. After all, 802.11h has been standardized since, like, 2018 or so, with exactly zero mass-market support so far.

Absent a proper 'how to join the local Wi-Fi network' story, IoT connectivity is converging around LoRaWAN anyway.

As I said, I like the 5GHz support, but spinning it as 'power efficient' is a reach.


>nobody seems to be in a big hurry to implement that.

Just checked, and my 2+ year old APs do support it according to the Asus website (Asus RT-AX92U)... so it can't be that uncommon.


The problem is, until nearly all networks support it, nobody can release a product that relies on it.

If you buy a battery-powered IoT doorbell and its box claims a battery life of 1 year, but then the battery is dead in a week because your router doesn't support the right 802.11 extensions, are you going to blame the router maker or the doorbell?


These things are always chicken & egg.

It has to start somewhere, and this seems like good progress.


Really? I don't think I've ever seen a LoRaWAN product in the wild.

Lots of ZigBee stuff, lots of ESP8266-based WiFi stuff (which generally requires a crummy phone app to perform the initial WiFi setup), and a bit of Insteon stuff hanging on despite whatever the heck is going on with the company.


LoRaWAN sort of dominates the agricultural IoT market (the S is for security). I guess it depends on your target area, but it is alive and very well.

Even if you aren't a blockchain fan, Helium [1] is still alive and well, and it is helping spread LoRaWAN even further. I live in central KY and have several Helium networks near me. Never in a million years did I expect blockchain to end up on farms in central KY, but here we are.

[1] https://www.helium.com


I have seen farmers using LoRaWAN, and The Things Network has pretty good availability in Europe. It's not really for smart home uses, more industrial/long range applications.


We have tens of thousands of street lights using LoRaWAN, amongst other things.

It’s very much being used! Just might not be that obvious.


I have a LoRaWAN weather station; it's for outdoor long-range stuff.


I'm curious, what station are you using?


Portions of 802.11h like DFS are widely supported by pretty much all manufacturers. Without DFS, many of the 5GHz channels are unavailable for use, so there are strong incentives for manufacturers to implement support.

Other WiFi features which allow power savings like APSD (802.11e) have widespread support too.


I see lots of LoRaWAN hardware for sale, but nothing really using it. There are 802-series standards for WiFi-compatible-ish packet-switched radio on the ISM bands; I suspect that will get more traction once it's finally in real products.


Strong resemblance to the ESP32-C6 [1], which was announced a year ago and never shipped (from what I can see): RISC-V, WiFi 6 + BT 5.0. The 5GHz support is new and great to see; let's hope this one materializes!!

[1] https://www.espressif.com/en/news/ESP32_C6 https://news.ycombinator.com/item?id=26758050 (179 points, 110 comments)


The article says: 'The ESP32-C5 enriches Espressif’s Wi-Fi 6 solutions as a follow-up to the ESP32-C6 SoC, which was announced last year'


Odd naming scheme, aren't the version numbers supposed to increase?


They're not version numbers, they're model numbers. And the C5 seems to have fewer features/peripherals than the C6.


Espressif has been releasing a lot of chips recently (past few years) and covering a lot of bases.

I am looking forward to their ESP32-H2 in order to get Zigbee functionality.

Between these chips and MicroPython (and other easy-to-use-on-microcontrollers languages), they are making home-developed IoT devices easy to create.


Check this out: https://esphome.io

It was originally created for https://www.home-assistant.io, but has morphed into its own thing entirely.


The main reason these chips are so widely used is the low cost (well, and getting everything in one package including Wi-Fi).

Wouldn't Wi-Fi 6 require relatively expensive IP licensing? (Assumption.)


I think Espressif just ignores licenses.


(I think so too.)

Then it becomes a problem for those that use Espressif chips.

I'm pretty sure there is a large volume of cheap/crap WiFi-controlled things being sold in the EU/US by at least somewhat reputable retailers with Espressif chips inside.


Okay, so I assume microcontrollers don't need the bandwidth; is this just about spectrum crowding on 2.4GHz and allowing people to run 5GHz-only networks?


I've experienced pains using mobile apps to configure 2.4GHz IoT devices on dual-band home networks.

This is more of a shitty-router problem, but if everything is dual-band this problem will go away.


All these new models have been single-core chips. I do I²S mp3 + FFT encoding on an ESP32 and stream that over UDP, but am only able to do so because they are dual-core. It would be sad to see Espressif abandon this.


The recently released ESP32-S3 is dual-core, so it doesn't look like they are abandoning multi-core chips anytime soon.


> All these new models have been single core chips.

The new models have all been RISC-V-based. I wonder if the multi-core situation for RISC-V is somehow more difficult or more expensive, and Espressif is having trouble justifying it?


I suspect the RISC-V cores are quite a lot larger in silicon area. Remember that the ESP series needs to integrate analogue circuitry for the wifi stuff, so the silicon process isn't optimal for logic, which in turn makes silicon area even more valuable.


Espressif guy here: we're not.


AIUI, S models are dual-core and C models are single-core.

And S models still use Tensilica CPUs, but that will change over time.


> I²S mp3 + FFT encoding on an ESP32 and stream that over UDP

What does this mean, if you don't mind me asking?


You hook up a MEMS microphone and read the data (via I²S), encode it to mp3 and also to FFT data, and stream the mp3 audio via UDP to a server, which can then be connected to via VLC to play back that stream. The FFT data can also be streamed in a custom binary format, in order to have a real-time visual representation of the streamed audio.
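
The capture-and-send half looks roughly like this (legacy I2S driver + lwIP sockets; the mp3/FFT stages are elided, and the address/port are placeholders):

    #include <stdint.h>
    #include "freertos/FreeRTOS.h"
    #include "driver/i2s.h"
    #include "lwip/sockets.h"

    // Assumes i2s_driver_install()/i2s_set_pin() were already called
    // for the MEMS mic. Real code would feed the mp3 encoder and FFT
    // here; this just forwards raw PCM.
    void stream_task(void *arg)
    {
        int sock = socket(AF_INET, SOCK_DGRAM, IPPROTO_IP);
        struct sockaddr_in dest = {
            .sin_family = AF_INET,
            .sin_port   = htons(5000),
            .sin_addr.s_addr = inet_addr("192.168.1.10"),
        };
        int16_t samples[512];
        size_t bytes_read;

        for (;;) {
            // Blocks until a DMA buffer of mic samples is available.
            i2s_read(I2S_NUM_0, samples, sizeof(samples), &bytes_read, portMAX_DELAY);
            sendto(sock, samples, bytes_read, 0,
                   (struct sockaddr *)&dest, sizeof(dest));
        }
    }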


I really wish Espressif would release a new chip with dual-mode BT (BR/EDR + LE) again, besides the original one (which lacks a USB PHY). BT Classic is still around for many applications.


Take my money. I have been waiting for a chip that can do 5GHz.


What is the use case?

I've got a fair few and can't think what I'd want 5GHz for, but every time I ask why people want a particular feature, I'm impressed by what people are doing.


The FTC Robotics competition mostly uses two Android phones per robot: one for the robot and one for the controller. They use WiFi Direct to link up. For a while (and in some cases still) the phones only supported 2.4GHz. Congestion on the 2.4GHz bands may or may not lead to disconnects during competition play, so a general rule is no 2.4GHz networks. No hotspots, no access points, nothing.

Now in my use case I needed to deploy some cheap wireless devices to do things like queuing, automation, camera tally lights, etc. but couldn't because of that restriction, since many devices (Raspberry Pi, ESP32) only supported 2.4GHz. I ended up using the RTL8720DN or devices that contain it such as the Seeed Wio Terminal.


Can't you link over USB? Surely one of the phones is OTG.


OTG is used for the USB game controller(s) to the Android phone on the "driver" side. On the field, there's a second Android phone attached to the robot which receives the input from the driver side Android phone.


Doh, somehow I missed the "one for the controller" part. Sorry.


I think it's for environments where 2.4GHz is just trash and you'd rather not run any of it at all, and your IoT crap is the only reason you're forced to still stand up a 2.4GHz AP in the first place.

This lets you go 5-only, and that's big for some settings.


I noticed how much traffic was on my 2.4GHz IoT network last night, because my phone connected to it. TBH I have no problem with everything being on 2.4; I just wish it were common to have two 2.4GHz radios so you could have a true IoT-only network.


You can do that with two wifi modems.


> What is the use case?

I live in an apartment. Not even a super dense tower or anything, just a townhome-style complex where everyone has their own garage and front door. From where I sit right now, my phone can see nine networks on 2.4GHz channel 1 alone. Channels 6 and 11 are about the same.

I want everything I can get running on the 5 GHz band and in the future the 6 GHz band. I've been holding off on new APs to upgrade from 802.11ac until I can get ones that support 6 GHz.


Just because there are loads of networks doesn't mean there's loads of traffic. I see over 20 networks from my laptop's WiFi, and yet my whole smart home runs without a hitch.

What measurable benefit do you expect from moving to 5GHz?


> just because there are loads of networks that does not mean there is loads of traffic.

Networks simply existing causes traffic: the beacon frames that advertise SSIDs are always transmitted at the lowest speed supported by the network, which on 2.4GHz consumer gear almost always means 1 Mbit/s 802.11b. (This is the one actual real-world benefit to disabling SSID broadcast: eliminating the small interruptions caused by beaconing.)
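
Rough numbers, assuming ~300-byte beacons at the default 102.4 ms interval: each SSID transmits roughly 10 beacons per second, and a 300-byte frame at 1 Mbit/s occupies ~2.4 ms of air (before preamble overhead). Nine SSIDs is then about 9 x 10 x 2.4 ms ≈ 216 ms of every second, i.e. on the order of 20% of the channel eaten by beacons alone.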

Also keep in mind that 2.4GHz interference isn't just WiFi, it's also Bluetooth and decades worth of cordless phones, RF remotes, gamepads, keyboards, mice, etc.

Anyways, I use Ubiquiti gear at home, and one of the nice features it has is logging channel activity. On 2.4GHz channel 11 the utilization never goes below 25% and regularly hits 75% during peak home WiFi times. I have three smart lightbulbs and a bed on 2.4GHz at the moment; these devices have literally double-digit megabytes of activity over months, so the traffic on the band isn't me.

On 5GHz channel 161, on the other hand, I have three laptops, two tablets, two phones, and two TVs used almost exclusively for streaming, and with all that my average channel utilization is below 10%, with the spikes beyond that base level almost entirely correlated with activity of my devices.

---

I don't know what the actual real world difference is, but it's not exactly hard to make the case that moving devices to bands with shorter range and more available spectrum is a good thing for reducing the inadvertent interference caused by modern consumer tech.


Decreased interaction latency.


Did you ever consider the RTL8720DN?


These chips (ESP32-S2, S3, C3, C5) all support Bluetooth LE, but it looks like the original ESP32 is the only one that supports Bluetooth Classic. Does anyone know what is going on with that? Classic is what most Bluetooth devices actually use, right? Phones, earpieces, all that kind of stuff. Amirite?

(Correction: ESP32-S2 has no bluetooth at all, see dontknowmuch's response).


Classic BT is cancer by committee. Imagine the complexity of the IP networking stack, add WiFi: it's still smaller, lighter, and less complicated than the BT stack. Nobody wants this headache.


Sure, no doubt, but there is a bunch of stuff that uses it that doesn't have BLE. So if you want to interoperate with that stuff there's no way around it.


You know how you could probably sustain yourself by eating only butter? You are going to puke a lot and hate yourself, but will technically be alive. This is how I see BT, especially when used for audio (latency, "quality", dropped connections, pairing). Might as well not try at all.

By the way, in my earlier post I forgot USB. Everyone thinks USB is bad; BT is worse than the IP stack + WiFi + USB combined.


I can sympathize with that, but BT devices and software stacks are already out there in huge numbers. It's not even clear to me what BLE is good for if it can't communicate with existing BT devices. USB3 at least didn't make you throw out your USB2 stuff. Is it even possible to buy a BLE keyboard, headphones, etc.? I guess some newer phones support BLE; my old one doesn't, I'm pretty sure.


FYI, ESP32-S2 does not have bluetooth at all.


I'm wondering how flexible the WiFi driver stack is. Is it possible to implement a custom MAC-layer protocol?
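
Not a full custom MAC as far as I know, but the stock ESP-IDF driver does expose raw 802.11 frame injection and promiscuous RX, which gets you partway there (sketch, untested; note the driver also restricts which frame types you may inject):

    #include "esp_wifi.h"

    static void sniffer_cb(void *buf, wifi_promiscuous_pkt_type_t type)
    {
        const wifi_promiscuous_pkt_t *pkt = (const wifi_promiscuous_pkt_t *)buf;
        // pkt->rx_ctrl has RSSI/channel; pkt->payload is the raw frame.
        (void)pkt; (void)type;
    }

    void raw_wifi_setup(void)
    {
        // Assumes WiFi is already initialized and started.
        esp_wifi_set_promiscuous_rx_cb(sniffer_cb);
        esp_wifi_set_promiscuous(true);

        uint8_t frame[64] = {0};  // hand-built 802.11 header + payload goes here
        // en_sys_seq=true lets the driver fill in the sequence number.
        esp_wifi_80211_tx(WIFI_IF_STA, frame, sizeof(frame), true);
    }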


ABOUT TIME!


I have yet to see a real use case for these IoT chips (other than adding WiFi to your toaster or other unnecessary things).


I have quite a few running at home:

Irrigation system driving valves through H-bridges

Relays driving ceiling fans

IR blasters (actually an additional feature of the previous relay drivers)

PIR sensors

Expansion of a smoke detector to send a push notification to my mobile

Garage door opener

Security cameras

(Quite a few are ESP8266s rather than 32s, depending on the power and I/O lines required)


Can't mention any good use cases, downvoter?


I didn't downvote, but there are lots of useful things hobbyists do with these chips. WiFi-enabling homebrew CNC machines, for example.

To which you may say: just buy a cheaper USB cable. That's fair, but a homebrew CNC maker could have just bought one off the shelf as well. Part of the fun is making something exactly as you want it, with only the trade-offs you personally impose.

Adding WiFi to all the things is somewhat of a gimmick, but gimmicks are fun. ESP chips are fast, cheap, and easier to program than the alternatives. In many cases for hobbyist use the competitor is a Raspberry Pi, which is nearly always overkill.



