i barely get through the day with a fully charged phone, and i don't get enough runtime if my laptop is not fully charged.
so i'd like to know more about the science behind this. how much does the battery life actually improve by doing this. is the trade off worth the inconvenience of a shorter runtime before i need to recharge?
It's for people who only use like 40-50% each day. Going from 50% to 100% puts more strain on the battery than going from 30% to 80%, so limiting the total charge allows the maximum capacity to last longer (in terms of years) than otherwise.
And if you do ever need all 100%, just turn off the limit the day before.
For me it's a no-brainer: 85% on my current (brand-new) Android phone lasts around 50-55 hours.
I wish they made the batteries slightly larger, then made 100% on the battery meter secretly be 80% of the increased capacity so I could have my cake and eat it.
The Android API allows you to get the voltage and current of the battery (with negative current for discharging and positive for charging), so there are apps out there that tell you the real state of the battery when it's in that range. The phone will report fully charged before it's actually completely charged, but the gap isn't much. It's mainly so that if you leave it plugged in, it can cycle a bit up and down while still displaying 100% instead of continually trickle-charging.
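On Linux-based devices (Android included), those same readings are typically exposed through the kernel's power-supply class under `/sys/class/power_supply/`. A minimal sketch, assuming that layout — the supply name (`battery`, `BAT0`, ...) and the sign convention for `current_now` both vary by device and kernel:

```python
from pathlib import Path

# Assumed path; the supply directory name differs between devices.
SUPPLY = Path("/sys/class/power_supply/battery")

def micro_to_si(raw: str) -> float:
    """sysfs reports voltage_now/current_now in micro-units (uV / uA)."""
    return int(raw.strip()) / 1_000_000

def snapshot(supply: Path = SUPPLY) -> dict:
    # Whether positive current means charging or discharging depends on
    # the driver; check your device before trusting the sign.
    return {
        "voltage_V": micro_to_si((supply / "voltage_now").read_text()),
        "current_A": micro_to_si((supply / "current_now").read_text()),
    }
```
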
My guess is tablets would be similar, but I've never had one.
It's best for users that the UI never displays the true SoC (state of charge). They would not understand that the camera and NFC payments stop working below a true 10%, that a degraded battery cannot safely reach its nominal 100% voltage, etc.
Those are not pleasant facts that users always need to be made aware of. A 98% as understood by the BMS rounded up to 100%, an actual 12% displayed as 1%, etc., should be tolerated.
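That kind of remapping can be sketched as a toy function — assuming, purely for illustration, that the BMS linearly stretches the 12-98% true range onto a 1-100% displayed range (real firmware is certainly more elaborate):

```python
def displayed_percent(true_soc: float,
                      lo_true: float = 12.0, lo_shown: float = 1.0,
                      hi_true: float = 98.0, hi_shown: float = 100.0) -> int:
    """Linearly remap true SoC onto a friendlier displayed scale, clamped to 0-100."""
    frac = (true_soc - lo_true) / (hi_true - lo_true)
    shown = lo_shown + frac * (hi_shown - lo_shown)
    return round(min(hi_shown, max(0.0, shown)))

# A true 98% shows as 100%, a true 12% shows as 1%.
```
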
I kinda meant on the top end (So it never charges so high that the battery experiences increased wear).
But this happens at the bottom end too: your battery still has some charge left at 0%; the BMS protects it from being fully discharged. I'm just wishing there were a bit more so I don't have to make decisions about wearing out the battery faster.
Then I could enable OVERCHARGE and get 120% charge for travelling or when I know it's going to be a long day, at the cost of battery life.
That's the degradation rate of capacity, not overall capacity. But trading away 40% of the energy for a 25% lifetime increase is not a great trade for everyone.
You would then consume almost 2x as many of those cycles just recharging to make up for the lost capacity.
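The arithmetic behind that claim, as a sketch with illustrative numbers: if a charge limit leaves only 60% of the energy usable, delivering the same daily energy takes 1/0.6 ≈ 1.67x as many passes through the restricted range.

```python
def relative_cycle_rate(usable_fraction: float) -> float:
    """How many times more often the limited range must be cycled to
    deliver the same energy as one unrestricted 0-100% cycle."""
    return 1.0 / usable_fraction

# Giving up 40% of the energy (usable_fraction = 0.6):
print(round(relative_cycle_rate(0.6), 2))  # ~1.67x the recharge frequency
```

So a 25% extension of cycle life would be eaten, and then some, by the extra cycling — which is why the trade only clearly pays off for light daily users.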
Limiting charging to 80% means getting rid of constant-voltage charging mode altogether. Generally speaking, and in illustrative numbers, Li-ion batteries are charged as follows:
- Below 10%, battery is charged in pre-charge mode through a 10-foot pole.
- Above 10%, battery is charged in constant current mode, as in:
`charger.current_max(battery.CAPACITY * battery.CHARGE_C_RATE)`
`charger.voltage(battery.voltage + 0.0001)`
- Above 80%, battery is charged in constant voltage mode, as in:
`charger.current_max(battery.TERMINATION_CURRENT)`
`charger.voltage(battery.VOLTAGE_MAX)`
- `charger.charge_complete` shall return `True` if `battery.voltage == battery.VOLTAGE_MAX and charger.current == battery.TERMINATION_CURRENT`.
- * `battery.CHARGE_C_RATE` is generally set within 0.2 to 4; at 1.0, the charge current in mA numerically matches the mAh rating of the battery, a mostly-safe default. 0.2 is 5 hours to full charge, and 4 is "fast charge to 80% in 15 minutes".
- * all of the above is usually implemented in charge controller chip firmware, often in actual ROM; the bare minimum often done in real products is to hardwire `charger.current_max` into the chip in circuitry. That suffices unless intricate charging really is one of your differentiators.
- * `battery.state_of_charge` is more or less `find_acidity_or_pH(battery.voltage, battery.formulae)`, by the way. The capacity-voltage curve is a titration chart turned sideways.
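The staged behaviour above can be sketched as a toy simulation — a hypothetical first-order model with made-up numbers, not a real charger: constant current until the cell reaches `VOLTAGE_MAX` around 80% SoC, then constant voltage with the current tapering off toward the termination current.

```python
# Toy CC/CV charge simulation; all numbers illustrative, not a real cell model.
CAPACITY_MAH = 3000.0
CHARGE_C_RATE = 1.0    # 1C: charge current (mA) numerically equals capacity (mAh)
VOLTAGE_MAX = 4.2      # volts
TERMINATION_C = 0.05   # stop when current has tapered to 0.05C
DT_H = 0.01            # timestep in hours

def cell_voltage(soc: float) -> float:
    """Crude monotone OCV curve: 3.5 V empty, reaching VOLTAGE_MAX around 80% SoC."""
    return min(VOLTAGE_MAX, 3.5 + soc * (VOLTAGE_MAX - 3.5) / 0.8)

def simulate(soc: float = 0.2):
    current_ma = CAPACITY_MAH * CHARGE_C_RATE
    phase = "CC"
    while current_ma > CAPACITY_MAH * TERMINATION_C:
        if cell_voltage(soc) >= VOLTAGE_MAX:
            phase = "CV"        # voltage is held; current decays toward termination
            current_ma *= 0.98
        soc = min(1.0, soc + current_ma * DT_H / CAPACITY_MAH)
    return phase, soc

print(simulate())  # ends in CV phase, essentially full
```

The point of the sketch is the phase change: below ~80% SoC the charger is the current limit, above it the voltage clamp takes over and the current falls on its own.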
It is often vaguely claimed that this constant-voltage, variable-current mode used in the 80-100% range is harmful to batteries, so skipping it should improve longevity. As to why CV mode is implemented at all, and why there's no option to easily disable it, I don't know; everyone does it this way. I wish I knew whether altering `battery.VOLTAGE_MAX` would work, or whether `charger.allow_cv_charging` can be set to `False`, but those are often hardcoded to `4.2` and `True` in ROM.
It is not fundamentally possible to simultaneously set both the voltage and the current of the charger, as they are inversely related to each other and vary as a function of charge state. As the battery gets more full, the current is forced to decrease as the battery's resistance increases (or equivalently the voltage is forced to decrease, or some combination of the two). So those given parameters seem to be more like estimates of the capabilities of the charger at various estimated points on the curve than actual operational values. Unless I am misunderstanding you?
That is principally correct, and, in fact, a fixed-voltage power supply with a resistor in series works just fine for charging NiMH/NiCd, yes.
In Li-ion, however, I think it's something like: the voltage ramp is capped even while within the current limit, and afterwards the current is limited even below the cutoff voltage, or something to similar effect. It's definitely more complicated than just a constant max voltage fed through a 330R resistor.
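The coupling being discussed falls straight out of Ohm's law across the pack's series resistance (illustrative numbers): with the supply voltage fixed, the current is just the voltage difference divided by the resistance, so it falls on its own as the cell voltage rises.

```python
def charge_current(v_supply: float, v_cell: float, r_series_ohm: float) -> float:
    """Current into the cell from a fixed-voltage source through a series resistance."""
    return (v_supply - v_cell) / r_series_ohm

# Fixed 4.2 V supply, 0.1 ohm effective series resistance:
for v_cell in (3.6, 3.9, 4.1, 4.19):
    print(f"{v_cell:.2f} V -> {charge_current(4.2, v_cell, 0.1):.1f} A")
```

This is why a real Li-ion charger actively regulates rather than relying on a dumb resistor: early on the naive current would be dangerously high, and a CC stage has to clamp it.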
it depends on how you use your phone. when i am outside all day running errands, checking messages, calls, web searches... by the time i get home my phone is usually on the last % before shutting off.
i tend to buy cheaper older models of phones though, that may not have the strongest battery to begin with.