Speak EV - Electric Car Forums
1 - 10 of 10 Posts

·
Registered
Kia e-Niro 3
Joined
·
28 Posts
Discussion Starter · #1 ·
I suspect this may be a stupid question, but I've not been able to find an answer by googling...

Regarding the reduction in charging performance as a battery becomes more full: Is this only a speed issue, or is it also an efficiency issue?

In other words, is it more expensive to charge from 80-100% than, say, 60-80%? Or is it drawing less current and just taking longer to add the same kWh?

I suspect the latter, but would be great to hear it from someone who knows. Thanks in advance!
 

·
Registered
Joined
·
5,740 Posts
I don't think the efficiency is significantly different; it's just slower because the battery chemistry gets upset otherwise.

But if you are using a charger that has a time element (like a 45-minute limit or something), it could indeed be more expensive.
 

·
Registered
Kia e-Niro 3
Joined
·
28 Posts
Discussion Starter · #3 ·
I suppose a notable exception would be Source London, which charges per minute, presumably because their fees include parking. But anyhow, perhaps I shouldn't have said "expensive". I meant the amount of kWh, assuming a constant unit cost :)
 

·
Registered
Joined
·
5,062 Posts
I would imagine that a 60-80% charge will consume a virtually identical quantity of energy from the mains as a 70-90% one, though the 70-90% will take longer if the battery is tapering down, thanks to voltage levels rising and so the rate of fill reducing. But the 80-100% may be more expensive/less efficient, as it's likely to include BMS cell balancing, which can sometimes involve bleeding off excess voltage from cells that are threatening to go over-voltage; it all depends how this is done. But the quantity of electricity lost this way is surely tiny in comparison, so this is probably just splitting hairs.
 

·
Registered
Joined
·
1,490 Posts
An individual lithium cell will charge up to about 4.2V. Once this voltage is reached at the initial charging current, the current will reduce to maintain 4.2V, so it takes longer to charge the last 10% than the first 80%. Charging efficiency then also needs to consider the associated electronics of the charger: the battery may not be less efficient at accepting the charge, but the charger may be. What is generally observed in cells outside of cars is that they warm up only towards the end of the charge, which means less energy going into the battery chemistry and more energy wasted as heat.
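The constant-current / constant-voltage taper described above can be sketched numerically. This is a toy model with made-up cell numbers (the capacity, resistance, and voltages are illustrative assumptions, not real EV cell data), but it reproduces the point that the last 20% can take longer than the first 80%:

```python
# Toy constant-current / constant-voltage (CC/CV) charge simulation.
# All numbers are illustrative assumptions, not real cell data:
# a single cell modelled as an open-circuit voltage that rises
# linearly with state of charge, behind a fixed internal resistance.

CAPACITY_AH = 5.0            # assumed cell capacity, amp-hours
R_INT = 0.03                 # assumed internal resistance, ohms
V_EMPTY, V_FULL = 3.0, 4.2   # open-circuit voltage at 0% / 100% SOC
I_CC = 5.0                   # constant-current phase current (1C)
V_LIMIT = 4.2                # terminal-voltage limit triggering CV phase
DT_H = 1 / 3600              # one-second time step, in hours

def charge_time(soc_start, soc_end):
    """Hours to charge between two SOC fractions under CC/CV."""
    soc, hours = soc_start, 0.0
    while soc < soc_end:
        v_oc = V_EMPTY + (V_FULL - V_EMPTY) * soc
        i = I_CC
        if v_oc + i * R_INT > V_LIMIT:
            # CV phase: back off the current to hold 4.2 V at the terminals
            i = max((V_LIMIT - v_oc) / R_INT, 0.01)
        soc += i * DT_H / CAPACITY_AH
        hours += DT_H
    return hours

print(f"0-80%:   {charge_time(0.0, 0.8):.2f} h")
print(f"80-100%: {charge_time(0.8, 1.0):.2f} h")
```

With these numbers the 0-80% leg is pure constant current (0.80 h), while the 80-100% leg spends most of its time in the taper and ends up taking longer despite adding a quarter of the energy.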
 

·
Registered
Ioniq 38kwh 2020
Joined
·
765 Posts
An individual lithium cell will charge up to about 4.2V. Once this voltage is reached at the initial charging current, the current will reduce to maintain 4.2V, so it takes longer to charge the last 10% than the first 80%. Charging efficiency then also needs to consider the associated electronics of the charger: the battery may not be less efficient at accepting the charge, but the charger may be. What is generally observed in cells outside of cars is that they warm up only towards the end of the charge, which means less energy going into the battery chemistry and more energy wasted as heat.
I've observed that NiMH cells tend to warm significantly as they reach full charge, though li-ion doesn't seem to, e.g. on a phone (in fact a phone gets quite warm when charging from low levels). It's less warm at the end of the charge in my experience. Probably that's due to the charge rate tapering down, but it would suggest that the early part of the charge (the faster part) wastes more energy as heat.
So less heat is wasted in a slower charge (which by default would be a charge from relatively full levels, e.g. 80%).
It would be interesting to measure, though. We know that rapid charging warms the battery (a lot), but does a 7kW charge from low SOC waste a lot of heat too, compared to, say, a slow 2.2kW charge on the granny charger?
When I've charged at 7kW AC, the charge rate doesn't seem to taper down at all, except in literally the last minute or so. It may still be adding a lot of unnecessary heat to the battery, but as this is presumably well within tolerance for battery temperature, the BMS allows it, even though it makes it less efficient than a slower charge rate.
Anyone got a graph of battery temps whilst charging at different rates?!
A good test would be comparing, say, a 20-100% charge on a 7kW charger versus a 2.2kW granny charger, and seeing what the total energy used is. I'd guess the 2.2kW is more efficient (except maybe at sub-zero temperatures), even though it takes longer.
 

·
Registered
40kW Leaf Tekna & 22kW Zoë Q210 dynamique intens
Joined
·
743 Posts
Having watched my Leaf's charging current, it doesn't seem to reduce significantly until it hits about 92% (on a 7kW charger).

My understanding of this is that the BMS does this to protect the battery as it becomes saturated.

It also depends on the ambient temperature of the battery while it's being charged. I noticed today, with the temperature hovering around freezing, that I didn't have full regen available until my SOC had dropped to 70%.

On a warmer day this is available at 80%, so it's not really to do with efficiency; it's to extend the life of your battery.
 

·
Registered
VW Passat GTE
Joined
·
676 Posts
There are a few things at play here.
  1. Power lost to keep things switched on during charging
  2. Power lost in the battery
  3. Power lost in the charger
1 should be fairly static across the charge cycle, at least until charging has finished.

2 is interesting. Think of the battery as an ideal battery with some lumped series resistance. When charging (and discharging), resistive losses are P = (I^2)R, so for a fixed loss resistance, efficiency decreases as current increases. (From what I remember, the battery resistance doesn't change a huge amount as the battery charges - I think it is highest when SOC is low, drops a bit, then stays fairly flat. Don't quote me on this; it's been a while since I've looked at it, and I'm tired.)
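To put rough numbers on point 2: with the lumped-resistance picture, the fraction of input power lost in the pack grows with charging current. The pack voltage and resistance below are illustrative assumptions, not figures for any particular car:

```python
# Resistive loss in the battery for a fixed lumped series resistance.
# Illustrative numbers only: a 360 V pack with 0.1 ohm resistance.
PACK_V = 360.0   # assumed nominal pack voltage
R_PACK = 0.1     # assumed lumped series resistance, ohms

def battery_loss_fraction(power_kw):
    """Fraction of charging power lost as I^2 R heat in the pack."""
    i = power_kw * 1000 / PACK_V      # charging current, amps
    loss_w = i ** 2 * R_PACK          # P = (I^2)R
    return loss_w / (power_kw * 1000)

for p in (2.2, 7.0, 50.0):
    print(f"{p:5.1f} kW charge: {battery_loss_fraction(p):.2%} lost in the pack")
```

With these assumed numbers the pack loss is a fraction of a percent at granny-charger rates and only a few percent even at rapid-charge currents, which supports the "don't worry about it" conclusion below.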

3 is also interesting. It depends very much on the charger design, but in a bog-standard inductive or capacitive DC/DC, efficiency increases as current increases. A fancy DC/DC will have tricks to improve low-current efficiency, but generally speaking, they all have an efficiency drop-off at low currents.

So you have opposite things happening as the charging current changes:
  • High current = efficient charger, but high resistive losses in the battery.
  • Low current = inefficient charger, but low resistive losses in the battery.
Which wins? My money is on the charger being the dominant factor.
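The two opposing effects can be combined in one toy model. Here the charger's low-load inefficiency is modelled as a fixed overhead power on top of a constant converter efficiency; the overhead, converter efficiency, pack voltage, and resistance are all assumptions for illustration, not measured values:

```python
# Toy overall-efficiency model: a charger whose efficiency falls off
# at low load (modelled as a fixed overhead power plus a constant
# converter efficiency) feeding a pack with I^2 R losses.
# All parameters are illustrative assumptions.
PACK_V = 360.0       # assumed nominal pack voltage
R_PACK = 0.1         # assumed lumped pack resistance, ohms
OVERHEAD_W = 150.0   # assumed fixed charger/electronics overhead
CONV_EFF = 0.95      # assumed converter efficiency at load

def overall_efficiency(power_kw):
    """Energy reaching the cells / energy drawn from the mains."""
    p_in = power_kw * 1000
    p_after_charger = (p_in - OVERHEAD_W) * CONV_EFF
    i = p_after_charger / PACK_V               # charging current, amps
    p_into_cells = p_after_charger - i ** 2 * R_PACK
    return p_into_cells / p_in

for p in (2.2, 3.6, 7.0, 11.0, 22.0):
    print(f"{p:5.1f} kW: {overall_efficiency(p):.1%} overall")
```

Under these assumptions the fixed overhead dominates at 2.2 kW, overall efficiency peaks somewhere in the middle, and the resistive losses only start to pull it back down at higher powers - consistent with the charger being the dominant factor at typical AC charging rates.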
 

·
Registered
Kia e-Niro 3
Joined
·
28 Posts
Discussion Starter · #9 ·
Thanks to all for the detailed replies. It seems each post is more technical than the last, and it's now way over my head.

I just wanted to know whether to avoid unnecessary charging to full for cost reasons alone. I will take from some of the earlier comments that any losses are negligibly small and that this is predominantly a matter of speed, not power consumption.
 

·
Registered
VW Passat GTE
Joined
·
676 Posts
I just wanted to know whether to avoid unnecessary charging to full for cost reasons alone. I will take from some of the earlier comments that any losses are negligibly small and that this is predominantly a matter of speed, not power consumption.
Honestly, I wouldn't worry about it. You'd need a pretty lousy charger to get near the efficiency of an ICE. Just get yourself onto a decent energy tariff and enjoy the car.
 