An efficient power converter design reduces resting power consumption by 50% (news.mit.edu)
86 points by werediver on March 16, 2017 | 39 comments



This is progress, but in a small part of the power supply. It looks like they have a charge-pump type power supply, and they turn off the charge-level checking circuit except during polls. They also adjust the polling rate down when not much is happening. The question is how fast they can respond to a sudden demand for output power. Can a sudden load drain the capacitor before the input side notices and pumps it up again?

Excessive standby power consumption is a big problem. There are lots of devices, especially TVs, which consume far too much power when supposedly off. (Of course, some of them are constantly listening to you, phoning home, and possibly spying on you.)

One of the more important recent developments in switching power supplies is the elimination of electrolytic capacitors. They're the biggest point of failure. This is already happening in LED lightbulbs.[1][2] Electrolytic caps inside LED lamp units are the primary cause of failure. They only last about 10,000 to 20,000 hours, while high-output LEDs are good for 40,000 hours. If a power supply can be built with lower capacitance, ceramic capacitors can be used. That makes it possible to get the lifetime of all the power components above 100,000 hours, so the LEDs will burn out first.

[1] https://hub.hku.hk/bitstream/10722/164083/1/Content.pdf
[2] www.rle.mit.edu/per/wp-content/uploads/2014/10/Chen-Electrolytic-Free.pdf


> Excessive standby power consumption is a big problem. There are lots of devices, especially TVs, which consume far too much power when supposedly off

Is this really a problem anymore? I have a five year old 50" plasma, and it draws 0.12 W in standby. That's not "excessive", it's a rounding error. It's 1 kWh per year, or roughly 0.6 kg of CO2; the same as idling your car for an extra 2 seconds every day.
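
Back-of-envelope, as a sanity check (the ~0.5 kg CO2 per kWh grid intensity is an assumption on my part; it varies a lot by country):

    # Annual energy and CO2 from a 0.12 W standby draw.
    standby_w = 0.12
    kwh_per_year = standby_w * 24 * 365 / 1000
    co2_kg = kwh_per_year * 0.5            # assumed grid intensity, kg CO2/kWh
    print(f"{kwh_per_year:.2f} kWh/year, ~{co2_kg:.2f} kg CO2")
    # -> about 1 kWh/year and ~0.5 kg CO2, consistent with the figures above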


The mean power consumption is significantly higher than the median because the worst offenders are drawing tens of watts in standby mode. For example, this 4-year-old Philips TV draws 30 watts in standby: http://www.supportforum.philips.com/en/showthread.php?15754-...

Cable/satellite boxes also frequently draw multiple watts in standby or "off". If every bit of consumer electronics were as thrifty with standby as your TV, it wouldn't be a problem anymore.

(I'll add: my iPod Touch can "wake up" instantly when I interact with it after days sitting unplugged, and that's drawing from a tiny battery. Even 120 mW is far above the lower bound of what's necessary to build a complex device with fast wakeup upon user interaction.)

(Second edit: the power converter that started this discussion is obviously targeted at much lower power applications than even an iPod, so it's not really relevant to anything that can be plugged in at home.)


Yeah, but that's just you. There were 200+ million TVs sold each year from 2011-2015.


200+ GWh/year is a rounding error on world scale, where yearly total electricity generation is at least 1e5 times larger.


Just TVs. Just one type of consumer product needs a full power plant added every year to support its standby draw. Yes, this assumes that a significant proportion will be plugged in, but it adds up.

Just TVs, not monitors, not computers, not audio receivers, not game consoles, not microwave ovens, not etc. I doubt the cost of designing this converter even approaches the yearly cost of a power plant.


200 GWh/year is a little less than 23 MW of continuous draw. A full power plant is on the order of several hundred MW to around a GW. You also don't need to replace the power plant every year. I agree that standby power usage is terribly wasteful, but your calculation is off by a large factor.
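
For anyone who wants to redo the arithmetic (assuming, as above, roughly 200 million TVs at ~1 kWh of standby energy per year each):

    # Convert the fleet's yearly standby energy into a continuous average load.
    tvs = 200e6
    kwh_per_tv_year = 1.0                      # per the 0.12 W standby figure
    gwh_per_year = tvs * kwh_per_tv_year / 1e6
    avg_mw = gwh_per_year * 1000 / (24 * 365)  # GWh/year -> average MW
    print(f"{gwh_per_year:.0f} GWh/year ~= {avg_mw:.0f} MW continuous")
    # -> 200 GWh/year ~= 23 MW, a small fraction of one utility-scale plant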


Given the Jevons Paradox, we can predict that the lower energy use will translate to higher demand for IoT technologies, thus increasing the world energy requirements for these devices.


> If a power supply can be built with lower capacitance, ceramic capacitors can be used.

That explains why most newer LED lamps flash at 120 Hz, slomo mode fails on cameras, and one gets weird strobing effects with fast-moving stuff...

I wish they'd just use electrolytic caps and make the light output actually constant...


> That explains why most newer LED lamps flash at 120 Hz, slomo mode fails on cameras, and one gets weird strobing effects with fast-moving stuff...

Yes. I hate it when LED light output isn't continuous and flickers visibly, especially when it comes from LED lights that are on (or are visible from) moving objects -- such as LED taillights on automobiles or in-road lane lights. All the eye movement (or object movement) makes the flicker even more flagrant :(

I remember extremely flickery lane lights on the leftmost lane markers on the infamous SR 110, on the northbound direction in the tunnels (the ones right before the left-exit to I-5 north).


You may or may not need to reduce the capacitance: modern multi-layer ceramic capacitors have very high capacitance values, allowing replacement of many other types of capacitors. It used to be that a 10uF capacitor was something you'd definitely need an electrolytic for (if not a tantalum), but now you can get that in a small surface-mount MLCC ceramic capacitor. Ceramics aren't limited to the picofarad ranges any more by a long shot.


True. I have some 1uF surface mount ceramic caps just in for a project of mine. I'm tempted to use this 220uF cap [1] to replace the only electrolytic on the board.

[1] http://www.digikey.com/product-detail/en/taiyo-yuden/JMK325A...


Just be sure to read the datasheet carefully. That particular capacitor is only ~120uF at typical 3.3V DC voltage. At 5V DC (which isn't really advisable for a 6.3V rated cap) it is only 88uF.


Just watch out for leakage.


MLCC caps work for low voltages and frequencies, but experience a sharp drop in capacitance when the voltage or frequency is increased.


Amen. Modern solid state electronic gizmos should never fail, but they frequently do because of electrolytic caps. (The other reason they fail is inadequate thermal engineering. This happens so often I'm beginning to believe in "dark pattern" thermal engineering to ensure early failure.)


> dark pattern

Just learned a new phrase today.

But, I believe you're looking for "planned obsolescence." It's common in car manufacturing.


Quote: “In the low-power regime, the way these power converters work, it’s not based on a continuous flow of energy,” Paidimarri says. “It’s based on these packets of energy. You have these switches, and an inductor, and a capacitor in the power converter, and you basically turn on and off these switches.”

To me that's a failed attempt to create a layperson-accessible explanation. In my NASA Space Shuttle work in decades past I also created efficient power supplies and I used similar methods -- but I think I can explain it more effectively.

The trick to making a modern power supply efficient is to control the phase relationship between voltage and current. This is normally performed in reactive elements like an inductor, a capacitor, or both.

Put another way, instead of changing a voltage level with a resistance (which would waste power), you need only manipulate a reactive element so the phase angle between voltage and current is something other than zero degrees. For example, at a phase angle of 90 degrees, you can have substantial voltage as well as current, but no power dissipation (except where it's needed).

That's the secret to power supply efficiency -- change the voltage without using any power-dissipating elements.
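
For the phase-angle point, a minimal numerical illustration (the 12 V / 2 A figures are arbitrary): real power is P = Vrms * Irms * cos(phi), so at a 90 degree phase angle the dissipation goes to zero even though voltage and current are both substantial.

    import math

    def real_power(v_rms, i_rms, phase_deg):
        # Real (dissipated) power for sinusoidal voltage and current.
        return v_rms * i_rms * math.cos(math.radians(phase_deg))

    print(real_power(12, 2, 0))    # 24 W: resistive load, all dissipated
    print(real_power(12, 2, 90))   # ~0 W: purely reactive, nothing dissipated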


But that's not the issue at 500pA to 1mA. Buck and boost converters already exceed 90% efficiency easily with modern technology.

The issue at 500pA to 1mA is the quiescent current of the device. The researchers here have basically created a very, very low quiescent current device by turning off even more things than typical. Even the voltage-divider which detects the presence of voltages has been turned off.

So I'd expect a bad transient response and maybe a brownout if the connected devices suddenly requested a bunch of power.


Would it be more efficient to have a secondary bus in the home that provides lower voltage and current for IoT type devices? Something like the USB ports built in to the power outlet but that actually has a single converter in the home with more efficiency.


Generally this doesn't work too well because low-voltage distribution is very inefficient. Even at very modest currents, you're going to need huge conductors or else deal with lots of voltage drop.

Compare that to a typical switch-mode power supply, which is already very efficient. The quiescent current of a modern, reasonably well designed power supply is basically negligible. The problems of 'vampire draw' have basically been solved for the actual power supply (devices that don't 'sleep' are still an issue, but that's no fault of the supply).

Recently I modified some LED fairy lights with a LDR/transistor setup. The quiescent draw during the day is about a tenth of a watt, vs. 6 watts under load. Left plugged in for a year in 'off' mode, that would cost me less than 10 cents.

This 'variable clock' switch-mode supply is all about squeezing every bit of battery life you can with a device. It's about making sensors that can be deployed without a power source that last for years instead of months.


This is exactly the type of response I was hoping for, thank you. Is this a similar reason to why long distance power lines are of a much higher voltage?


> Is this a similar reason to why long distance power lines are of a much higher voltage?

Long-distance power conductors use the highest practical voltage to avoid what are called "I squared R" losses, so named because the power dissipated in a resistance (like a transmission line) is equal to the square of the current times the resistance.

For a given power level, one can obtain that same power with a very high voltage and a proportionally lower current, thus greatly reducing resistive losses in the lines. This is why a million volts is by no means out of the question for long power transmission lines.
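
A quick illustration of that scaling (the 100 MW load and 10-ohm line resistance are made-up numbers, chosen only to show the square-law effect):

    # Line loss = I^2 * R; doubling the voltage quarters the loss.
    def line_loss_w(power_w, volts, line_resistance_ohms):
        current = power_w / volts
        return current ** 2 * line_resistance_ohms

    p, r = 100e6, 10.0                         # assumed 100 MW load, 10 ohm line
    for v in (100e3, 200e3, 400e3):
        print(f"{v / 1e3:.0f} kV: {line_loss_w(p, v, r) / 1e6:.2f} MW lost")
    # -> 10 MW at 100 kV, 2.5 MW at 200 kV, ~0.6 MW at 400 kV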


Correct. And why AC won over DC: It's much easier to step the voltage up for power transmission.


> It's much easier to step the voltage up for power transmission.

It used to be much easier to step the voltage up with AC, but now we have advanced enough power electronics to make the inherent increased efficiency of HVDC be actually worth it.


High voltage AC switches and breakers are also cheaper to make for arc-suppression reasons.


Short answer: NO.

This question seems to come up every time there's a discussion like this about low-voltage power. The whole reason we use high voltages in the first place is to reduce resistive losses. Ohm's Law (V=IR) plus the formula for power (P=VI) tell you everything you need to know, which is that the power lost to resistance is P = R*I^2. So for a given amount of power delivered, if you increase the voltage by a factor of 2, you decrease the current by a factor of 2, which cuts I^2 by a factor of 4, so the power lost in transmission is 1/4 as much.

If you tried putting 5V USB ports in your house, you wouldn't be able to use wire to connect them to the central power supply; you'd need to use massively-thick copper bus bars. The cost would be astronomical, and you'd need huge channels on the walls to hold them.


To give some context, 5V @ 1A takes ~12AWG wire, which is around the size of the wiring installed on a typical 120V 20A circuit in the US.

Any more current than 1-2A and the wiring gets silly. At 5A, which is around the power level used for LED lighting, you need 6AWG (13 mm^2) wire.

These figures assume an average of 40' of wire from converter to outlet. A per-room converter would be more reasonable, but you might be better off doing something like 120V DC so every non-motor device can drop the bridge rectifier.
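
A rough check of those gauge figures (a minimal sketch; the 40-foot run and a ~3% allowable drop are the assumptions from this comment, and the ohms-per-foot values are standard copper resistances):

    # Voltage drop on a 5 V branch circuit, out-and-back over the run length.
    R_PER_FT = {"12AWG": 0.00159, "6AWG": 0.000395}   # ohms per foot, copper

    def drop_percent(gauge, amps, one_way_ft=40, volts=5.0):
        r = R_PER_FT[gauge] * one_way_ft * 2          # round-trip resistance
        return 100 * amps * r / volts

    print(f"12AWG @ 1A: {drop_percent('12AWG', 1):.1f}% drop")   # ~2.5%
    print(f"12AWG @ 5A: {drop_percent('12AWG', 5):.1f}% drop")   # ~12.7%
    print(f" 6AWG @ 5A: {drop_percent('6AWG', 5):.1f}% drop")    # ~3.2%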


You could do that if you wanted (and it'll be efficient but in a very local sense).

But this article is mainly helpful for battery-powered and mobile IoT devices. At home, the biggest energy guzzlers are your HVAC system (heating and air conditioning), refrigeration, the electric stove, etc. Their power consumption is roughly a million times that of all the IoT devices combined. You're not saving anything if you only make IoT power consumption more efficient at home.


PoE was created exactly for this purpose! The converters aren't tiny by any means, but they can get pretty darn small if the device doesn't consume much power. Plus, it's routable using existing wiring in a house, uses AC to transmit and avoid resistive losses, and is backwards compatible with existing Ethernet tech. And if you need internet connectivity, you only need one wire, and you can cut your power consumption by ditching WiFi.


> [PoE] uses AC to transmit and avoid resistive losses

What? PoE (802.3af/at) operates at 48V DC. [0]

There is no AC being transmitted in PoE. 48V DC is used to avoid power loss due to the small-gauge wires in CAT5/6 cables, so it can provide power to devices up to 100m away, as per the Ethernet specification.

[0] https://en.m.wikipedia.org/wiki/Power_over_Ethernet
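
To put numbers on why 48 V DC was chosen (a rough sketch; the 12.5-ohm loop resistance for ~100 m of CAT5e and the 13 W load are assumptions on my part, not figures from the 802.3af spec):

    # Cable loss for the same ~13 W load at different distribution voltages.
    loop_r = 12.5                  # ohms, assumed ~100 m CAT5e loop
    power_w = 13.0                 # roughly one 802.3af-class powered device

    for volts in (48, 12, 5):
        i = power_w / volts
        loss_w = i ** 2 * loop_r
        print(f"{volts} V: {i:.2f} A, {loss_w:.1f} W lost in the cable")
    # At 48 V the cable eats under 1 W; at 5 V it would eat more than the load.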


Short story: they measure the output voltage every once in a while, depending on the load, instead of measuring it constantly.

> If no device is drawing current from the converter, or if the current is going only to a simple, local circuit, the controllers might release between 1 and a couple hundred packets per second.

> .. To accommodate that range of outputs, a typical converter — even a low-power one — will simply perform 1 million voltage measurements a second; on that basis, it will release anywhere from 1 to 1 million packets.

^ That's a normal switched-mode converter, though a million measurements a second is not realistic and they don't always work like that.

Considering that today's computer power supplies switch at around 70kHz, a megahertz is probably too much. (Granted, PSUs do PWM, so I may be wrong; it depends on the size of the capacitor at the output. Then again, electrolytic capacitors are bad for efficiency, and film capacitors are huge compared to their capacitance, so they would have to be refilled more often, meaning more pulses and more losses in the FETs. But then again, transistors now have much lower capacitance and resistance than ever before... ramble ramble ramble, etc.)

> Paidimarri and Chandrakasan’s converter thus features a variable clock, which can run the switch controllers at a wide range of rates.

Theirs just varies the rate of (in programmer's terms) poll()ing. Like clocking down a CPU when there is no load.

(note: "packets" means pulses; see "switched-mode power supply" on Wikipedia)
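
In (hypothetical) code, the variable-clock idea is roughly this -- my analogy, not the actual control logic on the chip:

    # Poll the output voltage at a rate that tracks recent load, instead of
    # at a fixed 1 MHz. Idle -> ~1 Hz, heavy load -> up to 1 MHz.
    MIN_HZ, MAX_HZ = 1, 1_000_000

    def next_poll_interval(packets_last_second):
        rate_hz = min(max(packets_last_second, MIN_HZ), MAX_HZ)
        return 1.0 / rate_hz

    print(next_poll_interval(0))          # 1.0 s between measurements when idle
    print(next_poll_interval(500_000))    # 2 microseconds under heavy load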


A bit longer story: they add a transistor before the voltage divider that does the measurement, and keep the circuit open most of the time:

> so in the MIT researchers’ chip, the divider is surrounded by a block of additional circuit elements, which grant access to the divider only for the fraction of a second that a measurement requires

That's not something you'd do in any high power PSU. It is really an IoT thing. It will decrease the full-on efficiency, while increasing the sleep efficiency.

I also don't think they are using a 1 MHz PWM anywhere. That number probably leaked into the journalist's notes from some irrelevant detail.


Let's look at this practically: http://www.ti.com/product/tps62240/description

This TPS62240 has an efficiency of 95% across any load above 1mA. Below 0.1mA, its 15uA quiescent current starts to kill its efficiency... but it still has over 70% efficiency at 0.1mA. That is, the converter draws roughly 0.13mA while delivering only 0.1mA. (It really should be in terms of watts, but since P = V x I, the mA estimate is a good enough indicator.)

A single NiMH AA battery has about 2000mAh of capacity; Eneloops are actually closer to 2500mAh. https://www.amazon.com/Panasonic-BK-3HCCA4BA-Eneloop-Pre-Cha...

So in effect, a SINGLE AA NiMH battery can run at 0.13mA for around 2 years. In practice, the self-discharge of these batteries is the bigger problem over those time periods.
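
That estimate, spelled out (capacity and draw figures as above; self-discharge ignored):

    # Runtime of one NiMH AA cell at the ~70%-efficiency operating point.
    capacity_mah = 2000
    draw_ma = 0.13                 # 0.1 mA load plus converter overhead
    hours = capacity_mah / draw_ma
    print(f"{hours:.0f} h ~= {hours / (24 * 365):.1f} years")
    # -> roughly 15,000 hours, a bit under 2 years, before self-discharge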

At some point, reducing quiescent power consumption isn't a major problem anymore. I think the ridiculously efficient 500pA draw is unlikely to be used in any design where an AA battery is sufficient.

Smaller coin cells have issues ramping up to the ~50mA needed to transmit a radio signal. So I'd bet that in practice, AA batteries (or larger 18650 cells) will continue to be used... and therefore a hugely power-efficient design from these researchers won't have a major competitive edge.

-------

As an FYI: these efficient power converters enter a "sleep" mode while the output voltage sits roughly 1% above nominal. Once the voltage drops back to the nominal value, the TPS62240 "wakes up" and starts injecting charge into the capacitor/inductor... before entering another sleep cycle.

Furthermore, at higher currents, I have my doubts that a charge-pump design would be superior to the capacitor/inductor design of a typical buck (or boost) converter. So sure... this charge-pump design might be more efficient at 0.1mA, but I bet you that the capacitor/inductor is more efficient at 50mA. (The TPS62240 hitting 95% efficiency in the >1mA range)

> new power converter that maintains its efficiency at currents ranging from 500 picoamps to 1 milliamp

Hmmm... so this design really is aimed at sub-1mA currents. I really wonder what applications they're targeting. Below 1mA, I say just be wasteful. If it takes two months for an AA battery to run out of charge... is there really an issue?


There are uses for this, but not in the applications you're thinking of.

Think recovering energy from radio waves, solar powered sand-size devices, etc.


Maybe this is relevant:

https://youtu.be/1vYJq4GeXPM

Dave has recently busted another device claiming 0 standby power consumption.


Would it make sense to have two power supplies in the IoT device? One for the nanoamp draw and a second one used only when the radio is on?


What kind of stuff uses 0.7-0.9V for its power supply? Is that even enough to switch a transistor?


I recently powered on an old Acer laptop that I hadn't used for a while. It informed me that "/dev/sda2 has gone 941 days without being checked"; after that finished and it booted fully up (a quick affair, even on that machine with no SSD), the i3 status bar further informed me that the battery still held a 57% charge :)




