Having worked a bit in the industry, I'm skeptical about this study. I've definitely seen studies and experiments that used different initial charging conditions and that would have shown better fade performance if this were true.
Not to mention, how much does the increased SEI change the impedance of the cell (thus reducing the subsequent charge speed) and the capacity available?
Agreed, the study summary needs better explanation to justify the contradiction with dozens of other lab tests. We have several boxes of 21700 cells from various manufacturers (Samsung/Sony/Panasonic) undergoing aging trials for over 2 years now.
All LiIon and LiPol chemistries have shown the following:
1. Deep-cycle discharges below 60% of full charge cut usable charge-cycle counts from 8000 to under 2000 uses.
2. High-current discharge or rapid charging accelerates capacity loss by about 15% a year.
3. Internal resistance goes up as dendrite defect shorts damage the cell. Additionally, self-discharge rates increase as the cell degrades.
Very surprising if the technique works for all cell chemistries. =3
This study solely focuses on the very first charge. It doesn't claim that recharging at high currents benefits battery life, only that the first charge at high current forms a larger protective barrier than a first charge at a low current.
Other studies have shown that a larger protective barrier improves lifespan. (See other comments on this thread for more details on the science.)
Some early Microchip LiIon chargers did not split the cycle into 3 to 4 stages (pre-prep, CC-prep, and CV rapid charge).
i.e. they would drop into a constant-voltage rapid charge mode assuming the cell was prepped already.
None of these systems showed any sort of increased capacity or longevity. Quite the contrary, which is why the new study's details are rather intriguing. =3
There are a lot of what-ifs in the poorly written press release.
In general, we track the internal resistance (±0.02 Ω for Samsung) of the inventory when new, charged, discharging, and in storage. While I don't doubt the paper's results, they need to be clear on the methodology so others can validate it is not BS (common in this area).
Unless you actually work for a cell manufacturer you aren't getting completely fresh cells though. They are talking about the first charge after the jelly roll is sealed into the can. When I would build cells by hand the standard procedure was to do the first couple cycles at 0.01C, record the capacity, and then change them to the charge rate for the experiment.
Perhaps. Normal practice is already usually to first cycle the cells in the CV charging region a few times to condition them for best capacity. Shipping regulations mean Li cells are no longer shipped fully charged, even when new from an OEM.
It is fascinating news, and I look forward to more details. =3
> Deep-cycle discharges below 60% of full charge cut usable charge-cycle counts from 8000 to under 2000 uses.
That is, if you do it a single time you are down from 8k to 2k? Or it decreases gradually and 2k is the worst case?
Where can I read about it? Not a paper, but something more down to earth for consumers? That is, for a consumer to know how to properly maintain various devices (phone/car) for longevity?
Keep in mind that for a car, 2000 cycles is still a fair bit. My BEV has a range of 350 km fully charged (highway). At 2000 cycles that's 700000 km.
That said, I previously read studies suggesting to keep the batteries between 30-70% SOC for optimal longevity, though I imagine there's been a lot of research so it might be now outdated.
Some older Tesla Models used the Panasonic 21700 cells.
In general, "the battery is the car" for EVs... The bigger the better in my opinion, as it will last longer due to reduced stress on the cells.
Even if people are stressing the vehicle pack, they should still get 5 to 12 years out of the car. Note, some companies hide the expected range loss by over-provisioning capacity.
This is definitely true. I recently faced issues with my motorbike battery, and oh boy, they are fragile and lose charge by themselves quickly compared to bigger car batteries.
Basically, a few times I left an older (10-15 year) diesel BMW standing the whole winter without touching it, and it always started without any issue (I know, not the best idea re: fuel in the tank, but it worked).
I did that once with a completely new motorbike (Honda) with a good brand battery, but I didn't unplug it, and now the battery is permanently damaged: it loses a full charge in less than 2 days, to the point that it can't start the engine, even if it has been completely unplugged for those 2 days.
Motorbike Pb AGM batteries are much different in modern vehicles. The prismatic packs often increase the plate surface area to bump cranking-amp ratings. Thus, the cell design is thinner and more fragile too. We used these in some equipment at one time, for the extended temperature range.
Tip: if a Pb pack is partially discharged, it is more vulnerable to cold-weather-related standby failures. Most people who own boats/heavy equipment get a plug-in trickle charger for Pb batteries, as the adapter also helps keep the pack slightly warm.
The problem is there are many different types of Li cells. Some tolerate a wider Safe Operating Area for power output and temperatures.
I don't want to get into the name-and-shame game with other manufacturers. The 3 brands mentioned are generally very good quality, and if you can source new cells without counterfeit/expired nonsense... they will perform as per their app notes.
The cycle limit is a function of whether your charger IC is smart, whether you slow-charge and discharge at low current, and whether your firm uses capacity boosting (stress costs cycle counts).
>Or it decreases gradually and 2k is the worst case?
Anecdotally, sensitivity seems somewhat correlated with cell use/age. The more stress, the faster the cell degrades.
Yes, Chinese batteries are so good the claims seem impossible... There are good manufacturers there like anyplace else, but they are rightly priced accordingly. =3
This may be so in Teslas, which have quite robust heat management. It definitely doesn't apply to many other brands. Anyone who knows anything about lithium batteries and sees the temperatures at which fast charging is done will not believe any of the longevity claims. It is up to 55C during charging, and during subsequent driving on a motorway it can take half an hour for this temperature to drop below 40C (look up the BYD Seal 1000-mile challenge on YouTube for an example - all of the above at 9C ambient).
BYD started as a battery producer and still is one of the largest in the world - I can't imagine them not having considered proper thermal management for battery health. Especially since they have been producing electric buses since 2010, which, as utility vehicles, see way higher usage (=charges) than consumer cars.
I can imagine, and I can also imagine the buyer of those batteries stripping down "functionality" because of costs. So EVs on the lower end could have higher peak temperatures, because it's cheaper (I guess).
Is it a case of them not thinking about it, or did they think about it and figure that consumers would not pay for the increase in quality?
Plenty of engineered things could be "better" if price wasn't a concern. Most of this board can probably think back to a time when they had to leave long-term value on the table because of short-term cost concerns. It doesn't seem impossible to me that the engineers would leave some battery longevity (something that's hard for the consumer to gauge) on the table in pursuit of faster charging speed at lower prices (headline marketing items).
Doesn't using an 800V architecture solve some of the heat problem? I believe the Koreans (Hyundai/Kia) and Porsche are currently the only major manufacturers using it. No surprise both can charge nearly twice as fast as their 400V competitors.
The total pack voltage shouldn't really matter for internal heating; that's due to the cells' internal resistance. Higher voltage really only allows for thinner wiring than the crazy-high-current, low-voltage packs (especially important for chargers/contacts, etc.).
The cells need to charge whether they're in series or parallel. The efficiency with which they can absorb charge at high speeds without heating up is not primarily determined by how they're wired. You could wire 5,000 cells in series and charge them with 20 kV at 4 amps, or wire them all in parallel and charge with 4 volts at 20 kA. Each cell will produce the same amount of heat either way; they only charge at about 95% efficiency. Higher voltage doesn't really reduce the need for active cooling if you want to keep the cells under 40-50C.
Energy loss through resistance in the pack's internal wiring is likely a lot less than the loss due to the chemistry not being 100% efficient at absorbing (or delivering) charge without heating up. But higher voltage does allow for thinner wires to get max power out of the battery.
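A back-of-the-envelope sketch of that point, using the hypothetical 5,000-cell numbers above and the ~95% charge efficiency figure (none of this reflects a real pack design):

    # Per-cell heating is set by per-cell power and efficiency, not by wiring.
    CELLS = 5000
    CELL_V = 4.0        # volts per cell while charging (hypothetical)
    CELL_A = 4.0        # amps per cell (hypothetical)
    EFFICIENCY = 0.95   # ~95% of delivered energy is stored; the rest becomes heat

    def per_cell_heat(pack_volts, pack_amps, cells):
        total_power = pack_volts * pack_amps      # watts into the whole pack
        heat = total_power * (1 - EFFICIENCY)     # watts lost as heat overall
        return heat / cells                       # watts of heat per cell

    series = per_cell_heat(CELLS * CELL_V, CELL_A, CELLS)      # 20 kV at 4 A
    parallel = per_cell_heat(CELL_V, CELLS * CELL_A, CELLS)    # 4 V at 20 kA
    print(series, parallel)  # both ~0.8 W per cell: wiring doesn't change cell heating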
But I'm pretty sure the voltage seen by each individual battery is always the same, regardless of the distribution system voltage.
There should be less heating at higher voltages, but if most of the heating is in the battery rather than the distribution system, then higher voltages will not help much.
Also, if they make all wires smaller to save money and weight then there might not be any change in heating.
The current per cell is still the same. 800V charging just means that you put cells in series banks to achieve an ~800V module-level voltage. Current is reduced in the main charging cables, charge port, and pack fuse/contactor, but not in the individual cells.
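A toy comparison along those lines, with made-up numbers (192 hypothetical 3.6 V / 50 Ah cells charged at 2C), just to show what changes and what doesn't between a 400V-class and an 800V-class arrangement:

    # Per-cell current is fixed by the C-rate; only the cable current changes.
    CELL_V, CELL_AH = 3.6, 50   # hypothetical cell: 3.6 V nominal, 50 Ah
    CELLS = 192                 # total cells in the pack
    C_RATE = 2.0                # charge each cell at 2C

    def pack(series_count):
        parallel_strings = CELLS // series_count
        pack_volts = series_count * CELL_V
        cell_amps = C_RATE * CELL_AH                 # per-cell charging current
        cable_amps = cell_amps * parallel_strings    # current in cables/contactors
        return pack_volts, cell_amps, cable_amps

    print(pack(96))    # "400 V" pack: ~346 V, 100 A per cell, 200 A in the cable
    print(pack(192))   # "800 V" pack: ~691 V, 100 A per cell, 100 A in the cable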
And because of this, electric vehicle manufacturers should take note.
If it's only a city car that you mostly charge at home, and you don't do road trips with multiple fast charges in a short period of time, you may get away with passive cooling. And if you don't live in a hot climate. But those are too many IFs.
That describes our use case pretty well (for a 2-car household), and we've been quite happy with the 26K miles we put on our Nissan LEAF in MA over a period coming up on 10 years.
Charged at home >50% of the time, and pre-pandemic on the 6.6kW chargers at work. I can only recall one attempt we made at a trip beyond a single charge, using an EVGo DC charger at the mid-point. I can say it worked, but subsequent trips to that same location were in the ICE car, so make of that what you will.
The car is now 80+% charged at home and is a city/nearby suburb runabout (and used for more trips, albeit not more miles, than the other ICE/hybrid).
It still has about 85% of its original battery capacity, which means we charge it about once a week, which works just fine for us.
That also works for me as a second car. I'm also awaiting delivery of a Leaf with 27k km on the odo; however, I did not expect the battery to lose 15% of its capacity over 26k miles (which is ~42k km).
It is healthy to know how to maintain a car battery. I will probably charge the battery to ~80% except when I need more range.
It’s also 10 years, which is a factor in degradation as well, not just cycles or distance.
It’s down 1 bar (of 12), and that was 2 years ago, so the 85% is estimated, but is within -0% to +4%. I have a LEAF Spy but haven’t checked it in a long time.
Wasn't that how the Nissan Leafs used to be set up, with passive cooling? I know it greatly affected their range in warm climates. I think they have now switched to active cooling.
How long does supercharging take? Even at 30 minutes, that is only a rate of 2C which is not that extreme for some cell chemistries as long as temperature is controlled.
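For reference, the C-rate arithmetic is just capacity divided by charge time. A quick sketch (this assumes a full 0-100% charge, which real supercharging sessions usually aren't):

    # A full charge in t minutes corresponds to a rate of (60 / t) C,
    # independent of the actual cell size.
    def c_rate(charge_time_minutes):
        return 60 / charge_time_minutes

    print(c_rate(30))   # 2.0 -> a 30-minute full charge is ~2C
    print(c_rate(20))   # 3.0 -> 20 minutes would be ~3C, if it were a full charge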
The analogy they use in the article is all sorts of dodgy too:
> Removing more lithium ions up front is a bit like scooping water out of a full bucket before carrying it, Cui said. The extra headspace in the bucket decreases the amount of water splashing out along the way. In similar fashion, deactivating more lithium ions during SEI formation frees up headspace in the positive electrode and allows the electrode to cycle in a more efficient way, improving subsequent performance.
They'll never do it because it means decreased profits.
There are articles that appear here and elsewhere semi-frequently about how doing something simple extends battery lifetimes a huge amount, but those never get implemented in practice except perhaps for highly niche applications.
Instead what usually happens is they'll find a way to make them last the same amount of time, but with higher energy density. The "high voltage" Li-ion cells (>4.2V end of charge) are an example of that process; they will last much longer than previous types if charged to 4.2V, but manufacturers would rather advertise them as 4.3 or 4.35 or even 4.4V(!) and the extra capacity that gives.
Hm, this doesn't seem to be panning out in practice. Loads of devices have grown "optimize charging"-style features in the recent-ish past, and those features are explicitly there to extend battery longevity (at the expense of consumer convenience, even!). Clearly, the market forces are more complex than "short battery lifetime = more frequent device upgrades = profit" (although that effect is certainly *a part of* the equation).
> They'll never do it because it means decreased profits.
This is a lazy dismissal of any process or efficiency improvements.
If buyers care to pay for efficiency improvements, products with them will be more attractive to them. If they don't, they won't.
If your theory were true, we wouldn't have things like rechargeable batteries, low-energy appliances, or light bulbs that would last more than two months.
There's always some performance point at which most people largely stop differentiating products based on efficiency or longevity improvements, and I'm not sure consumer Li-ion batteries are at that point yet.
> or light bulbs that would last more than two months.
Read up on the Phoebus Cartel, and more recently how LED lamps which were supposed to last "almost forever" when the technology was first introduced have not lived up to expectations at all. Also, unlike incandescents, LED lamps can last much longer and be more efficient, but they are deliberately made not to --- with some very narrow exceptions: https://news.ycombinator.com/item?id=27093793
Expensive LED bulbs do live up to the expectations. They also cost about $50 each because those kinds of LED bulbs are expensive to make; it's the other electronics and parts that drive the price tag up.
Also fittings - none of my (relatively expensive) installed downlights have failed since I put them in seven years ago, partially because they're well engineered but also because they're installed how they're designed to be. But I have a fitting designed for an incandescent bulb, and LED replacement bulbs (even decent ones) tend to fail within six to nine months in it, because they were never designed for the heat to escape properly, since incandescent bulbs didn't really need it. But I have others of the same bulbs in more open fittings and they last fine.
Yeah, I still have a few of the OG Philips x-prize bulbs going strong after well over a decade of use, plus a half dozen of the follow-ons that look very similar.
I don't think so. You can do your marketing so you "precondition your cells" and "have better charge and longevity with the same size and weight than competition".
I'm not into Apple, but I guess that if Apple could have chosen between that "lowering performance on iPhones when the battery capacity was decreased" shit and "precondition the cells to make them last longer", they would have chosen the second and make it very public.
A lot of energy research is speculative and it can take decades for research to go from the lab to the consumer.
This finding, however, specifically integrates with existing infrastructure; no new, unproven technology is needed, we simply juice the batteries more during the initial charge. If it pans out after extensive testing, we could see this technique hitting the market within 2 years.
> They'll never do it because it means decreased profits.
That's only true under monopoly conditions.
Fortunately, in capitalism, when there are two or more companies doing things like making phones, those companies actually compete on features. And battery longevity is absolutely a feature consumers care about.
And there's certainly no kind of monopoly conditions in cell phones. Competition is thriving. As it is in most types of portable electronics generally -- Bluetooth speakers, laptops, and so forth.
If you're the company that does it first, that means increased profits, because suddenly more people buy your product. And if you're the company that does it last, it means decreased profits, because fewer people will buy your product compared to the competition. That's the invisible hand at work.
TLDR: During a battery's initial "formation" charge, some of the lithium deactivates, forming a squishy, protective layer around the negative electrode, called the solid electrolyte interphase (SEI). Today, manufacturers typically do a slow formation charge, during which about 9% of the lithium is lost to the SEI. It was thought this was needed to form a robust layer. But the researchers found that at the higher initial charge currents used in this study, 30% becomes SEI - so you lose some battery capacity (for a given amount of lithium), but wind up with a beefier protective layer on your electrode and better longevity across subsequent charge cycles.
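A rough way to picture the tradeoff (the 9% / 30% loss figures are from the article; the 100 Ah "design capacity" is a made-up number for illustration):

    # Capacity left over after the formation charge eats part of the lithium.
    design_capacity_ah = 100.0

    slow_formation = design_capacity_ah * (1 - 0.09)   # ~91 Ah usable after a slow first charge
    fast_formation = design_capacity_ah * (1 - 0.30)   # ~70 Ah usable after a fast first charge

    print(slow_formation, fast_formation)
    # The claim is that the 70 Ah cell then fades more slowly over subsequent
    # cycles (~50% longer lifespan), at the cost of that up-front capacity.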
> so you lose some battery capacity (for a given amount of lithium), but wind up with a beefier protective layer on your electrode and better longevity across subsequent charge cycles
If there's a capacity tradeoff, why not use a slightly modified chemistry (like how LTO is, for example)? Though I guess this article was more about the existence of the phenomenon rather than using it.
I was able to revive lithium batteries which had been discharged too much and wouldn't charge, by connecting them to a fully charged one for a couple of seconds.
That's all about the electronics inside the battery, rather than the chemistry. You can force feed them with any power supply, ignoring the 'standard' BMS.
Since a good SEI layer on the electrode is important, couldn't they put the layer on the electrode before assembling the battery? Then they could make the layer's shape more even.
What's a battery's lifespan? Is it capacity degradation or random failure?
If the discovery slows down capacity degradation, but now your EV battery is 100x more likely to spontaneously fail ($$$), it's not really an improvement. Maybe OK for a consumer device, though.
There are two lifespans: the shelf life and the number of charge cycles (less of a span, perhaps) where you charge to 100% and discharge to near 0. If you keep your charge/discharge to 80/20, then your battery life is limited primarily by the shelf life. E.g., keep your Nissan Leaf in the 20-80% state-of-charge range and it will probably last 20 years; DC fast charge it every time to 100% and you'll probably only get 2000 cycles (5-7 years) out of it.
It isn't that black and white, plus the Leaf without active battery temperature management isn't a representative example.
Modern Teslas show fairly similar long-tail degradation that is nearly identical for cars that strictly home charge and those that only supercharge (based on customer vehicle tracking). Most will level off at 85-90% of original capacity.
TL;DR: the high current causes a layer on the negative electrode to form a bit differently (and obviously faster); previously it was thought that a slower initial charge led to better formation. This is a process tweak incremental improvement, not anything truly fundamental.
> This is a process tweak incremental improvement, not anything truly fundamental.
Regardless of whether this is a "process tweak" or something "truly fundamental", a 50% increase in battery lifespan would be huge.
The conspiracy theorist in me though thinks that a lot of consumer electronics makers wouldn't like this, because lower battery capacity has to be a big driver of upgrade cycles. I'm guessing a lot of folks are similar to me: these days, somewhere in the 2-3 year mark my cell battery capacity starts degrading noticeably. My phone otherwise works great, and I certainly don't need the features in the latest model phone, and of course I know I can pay for just a battery replacement, but sometimes I think "Well, if I need to replace the battery, I might as well get a new phone - it's got <some feature that is marginally better but that I'm now convincing myself is super cool to justify my not-really-necessary upgrade purchase>".
I think with 50% more battery lifespan I would rarely, if ever, use dwindling battery capacity as an excuse for an upgrade purchase.
Sorry that I didn't jump back on to respond sooner.
Re: consumer electronics, the obvious big example is cell phones, and notably major manufacturers have started adding features to extend battery lifespan by capping charge levels. Samsung has had this for a few years, originally capped at 85% but changed in a recent update down to 80%. I believe this occurred around the same time they extended their software support to 5 years. Apple added the same thing in iOS 17 with the iPhone 15 models, but despite obviously having the hardware capability ("optimized charging") they didn't enable the feature on earlier models.
I doubt it. Most electric car manufacturers offer an 8-10 year full battery warranty with their vehicles. I don’t think an extra 4-5 years is something they would risk consumer satisfaction for.
At the price of cars, they have to last. If cars only lasted 1 year, then manufacturers would sell a lot more cars, but they would all be bare bones and yield lower overall profit - many buyers would be willing to save $500 by not getting the heater option since they have to pay the entire cost of the car every year. Because the average car on the road is 12 years old, car manufacturers can sell on the trade-in value of the car - they know someone else will buy the car when it is 3 years old, and so the real cost of a brand-new car is 1/3 to 1/2 the actual price.
We have been trained to think of cell phones as semi-disposable. If people had to replace their car's $10k battery as often as they replace their phone they would be dead in the water.
You can replace the battery; it's just that they want to save as much space as possible for smaller and smaller phones. Also, having to take it in to replace the battery means you'll probably think about buying a new phone instead, which is what you're implying.
Vehicles differ greatly from consumer electronics. Batteries are usually thermally managed, there's way more mass to absorb thermal changes, charging is spread over thousands of cells, and there is reserved capacity hidden from the user...
Biggest thing is that 20% degradation in a cell phone means it can't survive a whole day, whereas that difference isn't that noticeable in a vehicle, where you're not running it to 0 anyway, just charging when needed.
If you get 50% from a process tweak, what else can you tweak? Have they changed the battery formula to get longevity where it is? What does that tweak do to volumetric power density? What does it do to price? Recyclability?
What nice things can you do if you take 130% longevity and something else?
As others have mentioned, features that preserve battery longevity have become common in many consumer devices, usually at the cost of current battery capacity. I'm not sure your conspiratorial side is all that accurate here :P
There's no conspiracy necessary here. Most people I know upgrade because they lose it, break it, or want the new camera or want a better screen or whatever. Or people who really hold on to their devices upgrade because the OS isn't getting upgraded anymore and apps won't install/update anymore. And then a lot of people who don't need any of those new things just replace the battery.
It's great that you use it as an excuse to get a new phone, but whatever small percentage of people wind up being motivated to upgrade specifically because their battery doesn't hold enough charge gets outweighed by the people who will buy one phone over another specifically because it's supposed to maintain its capacity for more years. Capitalism at work.
Lithium deactivation is inversely proportional to capacity. We could just add extra capacity to make up for it, though. From there, the battery would maintain capacity for a longer time than before.
> We could just add extra capacity to make up for it, though.
At naive face value, "just" adding an extra 30% capacity to offset expected lithium deactivation implies proportional increases in material COGS and package mass/volume, all other factors being equal.
Unless (a) a manufacturer is optimizing for throughput, (b) production is constrained at this initial charge stage, and (c) supply substantially lags demand, this strikes me as a non-starter in most of the consumer space.
Extra 21% capacity. Current practice still burns 9%. Lithium batteries have become very cheap, and I would pay a markup for a 50% longer battery life, assuming it didn't (a) further normalize non-replaceable batteries in consumer electronics or (b) lead to even worse conditions for the quasi-slaves currently mining lithium. Unfortunately, I doubt either of those will hold.
> Extra 21% capacity. Current practice still burns 9%.
On a normalized basis, if current practice yields 91% finished capacity (i.e. 9% deactivation loss), and the new proposed process is expected to yield 70% finished capacity (i.e. 30% deactivation loss), then the question is how much initial material must the new proposed process start with to end with the equivalent finished capacity as the current practice?
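Normalizing it that way, the arithmetic (loss fractions taken from the article) comes out roughly like this:

    # How much more starting material the fast-formed cell needs to match the
    # slow-formed cell's finished capacity.
    slow_yield = 1 - 0.09   # 91% of lithium remains usable after slow formation
    fast_yield = 1 - 0.30   # 70% remains after the proposed fast formation

    extra_material = slow_yield / fast_yield - 1
    print(round(extra_material, 2))   # ~0.3 -> roughly 30% more starting material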
There's a specialized version of this called BC/RL for battery research as well.
This particular article sits about halfway up the scale. This was an actual study, with actual batteries, that reportedly actually had improved lifespans. So dismissing it with a hand-wavy "this is all just academic nonsense" doesn't quite fly here. But of course from here to production is indeed quite a journey. I bet a lot of companies with active investment in battery R&D are paying attention and might try to replicate the success.
Also worth noting that if you only pay attention to the stuff that is at the highest levels, you basically miss out on new things until they are old news. For example if you have been dismissing solid state batteries, you might have missed the news that they are being used in products now.
As battery innovations go, this one seems relatively trivial to implement? The bigger problem is probably shipping battery packs that are sitting fully charged for a long time before the customer gets them, depending on the chemistry.
It turns out this is about the very first charge after assembly of the cell, not regular use.
However, I doubt that this finding will be used much, except perhaps in applications like aerospace; it is in manufacturers' economic interest that their products have short lives.
Edit: looks like as usual, comments that expose the truth get buried ;-)
Just 2 paragraphs down, it's very clearly explained:
> giving batteries this first charge at unusually high currents increased their average lifespan by 50% while decreasing the initial charging time from 10 hours to just 20 minutes.
And it's possible the benefit isn't nearly as big if you don't normally take 10 hours to fully charge a battery. It was just previously assumed that slower was always better.
The amount of time it takes to charge a battery is inversely proportional to the current - there is nothing surprising about the fact that using more current charges a battery in less time.
What is surprising about this research is that one small process change (doing that initial charge with a high current instead of a low one) resulted in a 50% increase in the total lifetime of the battery. That's the part that feels like "free money" in that, if accurate, this means battery producers could produce batteries with much longer lifespans without any fundamental change to their battery architecture or chemistry.
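To put a rough number on the current scale-up, assuming the formation step delivers about the same amp-hours either way (my assumption, not something the article states):

    # Cutting the first-charge time from 10 hours to 20 minutes implies
    # roughly 30x the formation current.
    slow_hours = 10.0
    fast_hours = 20 / 60
    print(slow_hours / fast_hours)   # 30.0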
It is free money in another way for manufacturers as well: since the battery doesn't have to sit in their factory/fixtures for as long getting that initial charge, there are a lot fewer in-process batteries in the factory and fewer charging fixtures and jigs to buy... These process savings are often invisible until an accountant looks closely and discovers they are massive.
If you've got your widget rolling off the production line once every 90 seconds, you only need 14 chargers to have the charging done just in time on the line itself, eliminating the "storage for distribution" step entirely.
That sounds like a minor detail, but accountants keep poking at things like that and discovering that such extra steps cost a lot of money. I'm guessing tens of millions more $$$, which either goes to more profit or to lowering prices - either is good (profit because I may be an investor, lower prices for customers).
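For what it's worth, the "14 chargers" figure above is just the formation time divided by the line's cycle time, rounded up (20 minutes and 90 seconds are the numbers from the comment, nothing else assumed):

    import math

    charge_seconds = 20 * 60      # 20-minute formation charge
    line_cycle_seconds = 90       # one widget off the line every 90 seconds
    print(math.ceil(charge_seconds / line_cycle_seconds))   # 14 chargers in parallel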
That's not true for passenger vehicles, particularly for high-spec products sold in the West. Integration of components such as battery cells, which have many critical performance parameters, is not trivial, and manufacturers are not free to substitute cells from commodity markets. EV manufacturers are either making their own battery cells to their own, proprietary standards, or they secure contracts with suppliers capable of making cells with consistent performance. Any change in cell characteristics, including supposed improvements such as the one appearing in this report, must be integrated by the manufacturer and supply must be assured.
It's not really a commodity market, despite appearances and hype.