>compared to a device that must constantly have the receiver powered up to listen
But there's no need for the receiver to constantly stay awake to listen or poll the transmitter like in wired network systems.
Low-power wireless protocols have used time slots forever: receivers wake up only in their dedicated time slots to check whether any message is addressed to them. If so, they wake the rest of the CPU, process the payload, and reply; if not, they put the receiver back to sleep until their next time slot. Simple and very energy efficient.
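Roughly, the receive side of such a scheme looks like the sketch below. It's just an illustration with hypothetical HAL names (radio_rx_on, deep_sleep_until, MY_ADDR and friends are placeholders, not from any particular stack):

```c
/* Sketch of a slotted low-power receiver loop.
 * All radio_*/deep_sleep_* calls and the constants are hypothetical. */
#include <stdint.h>
#include <stdbool.h>

#define SLOT_PERIOD_MS    1000u  /* one full beacon cycle              */
#define MY_SLOT_OFFSET_MS  120u  /* offset of this node's slot         */
#define MY_ADDR            0x42u

extern void     radio_rx_on(void);
extern void     radio_off(void);
extern bool     radio_read_header(uint8_t *dst_addr); /* short header only */
extern void     process_payload_and_reply(void);
extern void     deep_sleep_until(uint32_t wake_ms);   /* RTC-based sleep   */
extern uint32_t rtc_now_ms(void);

void slot_loop(void)
{
    uint32_t next_wake = rtc_now_ms() + MY_SLOT_OFFSET_MS;

    for (;;) {
        deep_sleep_until(next_wake);      /* CPU + radio off until our slot */

        radio_rx_on();                    /* listen only inside our slot    */
        uint8_t dst;
        bool got_header = radio_read_header(&dst);

        if (got_header && dst == MY_ADDR) {
            process_payload_and_reply();  /* wake the rest of the system    */
        }
        radio_off();                      /* back to sleep until next slot  */

        next_wake += SLOT_PERIOD_MS;
    }
}
```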
This also makes the receivers far more efficient than the transmitter, which has to keep beaconing for every time slot. That's exactly what you want: the base station can run on AC power, while the IoT receivers are usually battery powered and need to last for years.
The only tricky part is building a self-compensation mechanism in firmware for the receiver wake-up timing, because every receiver's oscillator inevitably drifts, and so does the transmitter's, especially with low-cost oscillators that have horrible drift.
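A minimal sketch of what that compensation can look like, assuming you timestamp each received beacon against the local RTC and trim the period with a simple IIR filter (the names and the gain are made up for illustration):

```c
/* Sketch of wake-up drift compensation: compare when the beacon actually
 * arrived against when we expected it, and nudge the local slot timer.
 * All names are hypothetical. */
#include <stdint.h>

#define GUARD_MS 4      /* wake this much before the expected beacon */

static int32_t slot_period_ms = 1000;   /* nominal period, trimmed over time */

/* Called once per received beacon; returns the next wake-up time. */
uint32_t on_beacon(uint32_t expected_ms, uint32_t actual_ms)
{
    /* Positive error: beacon arrived later than predicted, so our period
     * estimate is too short (our clock runs fast relative to the base). */
    int32_t error = (int32_t)(actual_ms - expected_ms);

    /* Trim by a fraction of the error so the estimate converges without
     * overreacting to one noisy measurement (simple IIR filter). */
    slot_period_ms += error / 8;

    /* Re-anchor to the beacon we just saw, minus a small guard band
     * to absorb residual jitter until the next resync. */
    return actual_ms + (uint32_t)slot_period_ms - GUARD_MS;
}
```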
I do a lot with LoRaWAN, and I like the simplicity of its approach for class A devices, which is to wake up and transmit and then listen for a reply in two defined time windows. These are shortly after the transmission, so clock drift is less of an issue with cheap oscillators.
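For reference, the class A pattern is roughly this. RECEIVE_DELAY1/RECEIVE_DELAY2 are the spec's fixed delays (1 s and 2 s by default); the function names and window length here are just placeholders:

```c
/* Class A device flow: every downlink opportunity is anchored to an uplink,
 * so the receive windows open a fixed, short delay after we transmit and
 * crystal drift barely matters. Function names are hypothetical. */
#include <stdint.h>
#include <stdbool.h>

#define RECEIVE_DELAY1_MS 1000u   /* LoRaWAN default: RX1 opens 1 s after TX */
#define RECEIVE_DELAY2_MS 2000u   /* RX2 opens 2 s after TX                  */

extern void transmit_uplink(const uint8_t *buf, uint8_t len);
extern bool open_rx_window(uint32_t delay_ms, uint32_t window_ms);
extern void deep_sleep(void);

void class_a_cycle(const uint8_t *payload, uint8_t len)
{
    transmit_uplink(payload, len);

    /* Try RX1; if nothing arrives, try RX2; then go back to sleep.
     * The 50 ms window length is an assumed value for the example. */
    if (!open_rx_window(RECEIVE_DELAY1_MS, 50u)) {
        open_rx_window(RECEIVE_DELAY2_MS, 50u);
    }
    deep_sleep();   /* radio and MCU off until the next scheduled uplink */
}
```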
I'm not sure a sensible timeslot-based system could get battery usage down as low as something like Zigbee battery-powered sensors manage.
Those are transmitters running on battery, usually with the radio fully powered down, only powering it up to transmit when they need to report a changed value, or to check in at least once a day so the hub knows they haven't died.
Something like a magnetic reed switch door/window sensor would wake up the MCU via a level-sensitive interrupt, so obviously the "sensor" portion for some sensor types can have negligible power draw.
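A sketch of that transmit-only pattern, with hypothetical HAL calls (the point is simply that the radio stays unpowered until the reed-switch interrupt or the daily heartbeat fires):

```c
/* Sketch of a transmit-only sensor: sleep with the radio unpowered, wake on
 * the reed-switch interrupt or a daily heartbeat timer, send, sleep again.
 * All HAL names are hypothetical. */
#include <stdint.h>
#include <stdbool.h>

#define HEARTBEAT_S (24u * 60u * 60u)   /* report at least once a day */

extern void gpio_enable_wake_irq(void);               /* reed switch pin    */
extern bool gpio_read_contact(void);
extern void radio_power_up(void);
extern void radio_send_report(bool contact_closed, bool heartbeat);
extern void radio_power_down(void);
extern bool sleep_until_irq_or_timeout(uint32_t seconds); /* true = IRQ woke us */

void sensor_main(void)
{
    gpio_enable_wake_irq();

    for (;;) {
        bool woke_on_irq = sleep_until_irq_or_timeout(HEARTBEAT_S);

        radio_power_up();                /* only now does the RF part draw power */
        radio_send_report(gpio_read_contact(), !woke_on_irq);
        radio_power_down();
    }
}
```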
How could any battery powered receiver possibly get power usage that low? Even time slots of only once a minute would seem likely to use considerably more power, and that's likely too much latency for many purposes. But surely more frequent time slots would only increase the power drain?
I think this understates the complexity of getting the time slots right, since you have to factor in drift from the firmware, drift from the hardware (which varies based on temperature), and also the propagation time of the radio signals. Essentially, you want the timeslot to be much larger than the sum of all the jitters, while also making sure there are enough timeslots per system cycle to account for all of the devices, at the data rate you want.
Not to say it's impossible to solve (although infinite-precision synchronicity in distributed systems is impossible to solve), just that the more devices you throw at the system, the trickier it gets, in a way that does not scale linearly with the system size.
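As a rough illustration of that timing budget, with made-up but plausible numbers (assumed 20 ppm crystals on each end, a 10-minute resync interval, and a 10 km worst-case link):

```c
/* Back-of-envelope slot sizing: the guard time has to cover accumulated
 * crystal drift on both ends plus propagation delay, and the slot has to
 * fit the frame on top. All inputs are assumptions for the example. */
#include <stdio.h>

int main(void)
{
    double resync_s  = 600.0;   /* resync to the beacon every 10 min        */
    double drift_ppm = 20.0;    /* +/-20 ppm crystal on each side           */
    double range_km  = 10.0;    /* worst-case link distance                 */
    double frame_ms  = 40.0;    /* on-air time of one frame                 */

    double drift_ms = 2.0 * drift_ppm * 1e-6 * resync_s * 1000.0; /* both clocks */
    double prop_ms  = range_km / 300.0;    /* ~3.3 us per km, negligible here    */
    double guard_ms = drift_ms + prop_ms;

    printf("guard  ~ %.1f ms\n", guard_ms);                  /* ~24 ms           */
    printf("slot  >= %.1f ms\n", frame_ms + 2.0 * guard_ms); /* guard both sides */
    return 0;
}
```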
>I think this understates the complexity of getting the time slots right
It's tricky, but solvable by any half-decent firmware engineer with low-level understanding and some battle scars from the industry.
>Essentially, you want the timeslot to be much larger than the sum of all the jitters
Not really. Drift is inevitable, but it's not so bad that you get huge fluctuations so quickly that you need to take such wide margins. Just sync all your receivers to the drift of the transmitter every few minutes/hours or so and you'll be fine. Depends on environmental conditions of course, which you should know up front when designing your product.
>just that the more devices you throw at the system, the trickier it gets, in a way that does not scale linearly with the system size
Not really, you just sync all receivers to the drift of the base station via the same drift compensation algo. If it works on one device it will work the same on all. Of course you will hit a device-count limit based on the maximum number of time slots you can have, which depends on the bandwidth available and the access time you want for each device's slot. We got a couple of thousand devices on one base station lasting ~1-2 years on one button cell, so it was good enough.
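To put rough, purely illustrative numbers on that limit (these are assumptions for the example, not the actual figures from that deployment):

```c
/* Back-of-envelope capacity check: how many device slots fit in one
 * system cycle for a given access-latency target. Assumed numbers only. */
#include <stdio.h>

int main(void)
{
    double cycle_s = 60.0;   /* each device gets a slot once a minute  */
    double slot_ms = 25.0;   /* frame + guard time per slot            */

    double slots_per_cycle = (cycle_s * 1000.0) / slot_ms;

    printf("max devices per base station: %.0f\n", slots_per_cycle); /* 2400 */
    return 0;
}
```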
If you do the addressing in AM, you can use just the radio waves themselves to power a decider that wakes up your device, and then do the actual communication in FM.
But the wave can't be too faint, so I guess you do always need a bit more power on one side.