Ask HN: Why is Tesla deploying 400V superchargers when others do 800V?
7 points by iknowstuff 17 days ago | 7 comments
Every other DCFC network is racking up massive losses and has huge reliability issues, so I'm not surprised Tesla is doing things differently, but I'm still wondering. The first 800V deployments started five years ago. How come Tesla never decided to future-proof their hardware?

Is the higher voltage contributing to the high fault rate of other chargers? Arcing problems?

Is it significantly more expensive to deploy because of the silicon carbide power electronics, or something along those lines?

The Cybertruck just splits its 800V pack into two parallel 400V halves to charge off 400V. They must have anticipated that backward compatibility wouldn't be an issue, so they're in no rush, but I assume they'd prefer to deploy 800V if it weren't for some downsides. What are they?
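
Back-of-the-envelope sketch of why the split works and what 800V would buy (plain Python; the 625A cable figure and 250 kW number are illustrative assumptions, not Tesla specs):

    # Toy model of an 800V pack charging from a 400V DC charger.
    # All figures are illustrative assumptions, not Tesla specs.

    PACK_VOLTAGE_SERIES = 800.0   # volts, halves in series (driving)
    PACK_VOLTAGE_SPLIT = 400.0    # volts, halves in parallel (charging)
    CHARGER_MAX_CURRENT = 625.0   # amps, assumed cable/connector limit

    # Power delivered in the split (parallel) configuration:
    power_split = PACK_VOLTAGE_SPLIT * CHARGER_MAX_CURRENT   # 250 kW
    # Each parallel half carries only half the cable current:
    current_per_half = CHARGER_MAX_CURRENT / 2               # 312.5 A

    # The same 250 kW at 800V needs half the cable current, which is
    # the main draw of higher voltage: thinner, cooler, cheaper cables.
    current_at_800v = power_split / PACK_VOLTAGE_SERIES      # 312.5 A

    print(f"{power_split/1000:.0f} kW at 400V needs {CHARGER_MAX_CURRENT:.0f} A in the cable")
    print(f"{power_split/1000:.0f} kW at 800V needs {current_at_800v:.0f} A in the cable")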




> How come Tesla never decided to future-proof their hardware?

They did. The V4 Supercharger pedestals they started installing last year support up to 1000V. These are also the pedestals they're selling to third parties like Ford and BP.

https://pbs.twimg.com/media/FrSEb8FWwAIUEZE?format=jpg&name=...

This is "future proofing" because they're currently installing them with the older power cabinets that don't provide that voltage. But V4 power cabinets have been spotted/leaked, so that's coming soon too. They can retrofit the existing V4 sites to support 800-1000V charging at any time.

https://pbs.twimg.com/media/FrSFAnMWwAIjQrS?format=jpg&name=...
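
Put another way, the voltage a car can actually get is capped by the weakest link in the chain. A tiny sketch of that logic (the 1000V pedestal figure is from above; the cabinet limits are assumptions I made up for illustration):

    # Sketch: usable charge voltage is the minimum of what each element
    # in the chain supports. Cabinet figures are assumed, not confirmed.

    def deliverable_voltage(cabinet_max_v, pedestal_max_v, vehicle_max_v):
        """Highest voltage every element of the chain can handle."""
        return min(cabinet_max_v, pedestal_max_v, vehicle_max_v)

    V4_PEDESTAL_MAX = 1000   # per the comment above
    OLD_CABINET_MAX = 500    # assumed limit of the existing cabinets
    NEW_CABINET_MAX = 1000   # assumed V4 cabinet limit

    # Today: a V4 pedestal on an old cabinet still can't feed an 800V pack.
    print(deliverable_voltage(OLD_CABINET_MAX, V4_PEDESTAL_MAX, 800))  # 500

    # After a cabinet retrofit, the same pedestal serves 800V natively.
    print(deliverable_voltage(NEW_CABINET_MAX, V4_PEDESTAL_MAX, 800))  # 800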



Interesting how they mentioned that cars don't use the full charge range of batteries but smartphones do.

There's really no reason a lithium battery should ever be charged to 4.2v.

I suspect almost any consumer would prefer the capacity/lifespan tradeoff at 4.1v, but we inexplicably settled on 4.2v, and now it's basically impossible to lower the voltage without 3x the complexity, because all the cheap, ultra-simple linear charger chips are fixed at 4.2v.

So we have tons of gadgets out there being charged to 4.2v that don't need the extra ~10% capacity; they run for months on a charge with typical use anyway.
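
To make the tradeoff concrete, here's a toy CC/CV charge loop (the linear OCV curve, taper, and the resulting ~10% figure are rough illustrative assumptions, not measured cell data):

    # Toy CC/CV charger: constant current until the cell reaches the
    # termination voltage, then taper the current until a cutoff.
    # Cell model and numbers are illustrative assumptions, not data.

    def charge_to(term_v, cap_ah=3.0, i_cc=1.5, i_cutoff=0.15):
        soc = 0.0
        dt = 0.001                                  # hours per step
        while soc < 1.0:
            ocv = 3.5 + 0.7 * soc                   # crude linear OCV curve
            if ocv < term_v:
                i = i_cc                            # CC phase
            else:
                i = max(i_cutoff, i_cc * (1 - soc)) # crude CV taper
                if i <= i_cutoff:
                    break                           # charge terminated
            soc += (i * dt) / cap_ah
        return soc

    # The ONLY difference between a "4.2v charger" and a "4.1v charger"
    # is this one setpoint -- which the fixed-function chips hard-code.
    print(f"4.2v termination: ~{charge_to(4.2):.0%} of capacity")
    print(f"4.1v termination: ~{charge_to(4.1):.0%} of capacity")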


If you get five years of life out of an $800 phone and its battery, you're probably pretty satisfied, and you probably even appreciate the extra battery time.

If you get five years of life out of a $50,000 car and its battery, you're going to be really pissed, and you won't care that you got ten extra miles of range for those five years.


Pretty sure my phone is the most expensive thing I've ever owned, so a few extra years of battery would still be nice.

Also, predictability over time is more important than absolute performance to me. Being unreachable is scary if you're always on call for remote support, and getting stuck somewhere without being able to call a Lyft could be really bad.


Am I missing something or wouldn’t this be trivially fixed with a boost converter?


Pretty much anything that will run on 4.2v will run on 4.1v; the problem is actually getting the charger to stop at 4.1v instead of 4.2v.

The easiest way to charge a lithium battery is with an integrated chip that doesn't have any kind of voltage setting. At the hobby level it's basically always a TP4056; in commercial projects it's often that too, but sometimes other chips.

Sometimes more expensive items even use more advanced chips that DO allow configuring the voltage, and still charge to 4.2v for some reason!
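
One workaround, sketched below (not a drop-in design): the TP4056 has a chip-enable pin, so a microcontroller that can read the cell voltage can just disable the charger early. This assumes a board that actually breaks out CE (many cheap modules tie it high) and a calibrated ADC; written as MicroPython for a generic board, with hypothetical pin numbers:

    # MicroPython sketch: stop a fixed-4.2v TP4056 at ~4.1v by driving
    # its CE (chip enable) pin. Pin numbers, divider ratio, and ADC
    # scaling are assumptions -- calibrate for your actual hardware.
    from machine import ADC, Pin
    import time

    CE = Pin(5, Pin.OUT, value=1)   # TP4056 CE: high = charging enabled
    vbat = ADC(Pin(34))             # cell voltage via an assumed 2:1 divider

    TERMINATE_V = 4.10              # our early cutoff
    RESUME_V = 3.95                 # hysteresis so it doesn't chatter

    def cell_voltage():
        # 12-bit ADC, 3.3v reference, 2:1 divider -- all assumptions
        return vbat.read() / 4095 * 3.3 * 2

    while True:
        if cell_voltage() >= TERMINATE_V:
            CE.value(0)             # pause charging early
        elif cell_voltage() <= RESUME_V:
            CE.value(1)             # resume once the cell sags a bit
        time.sleep(1)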



