Speak EV - Electric Car Forums

Car range reduced when using granny charger

6.3K views · 29 replies · 13 participants · last post by 4EV-ER
#1 ·
I have picked up on an interesting complaint in a Mercedes EQ forum that I am hoping can be explained in more detail by knowledgeable folk here like @Jeremy Harris 😉
Someone is finding that if they use the granny charger, the actual range for the SoC is reduced; in the worst-case scenario the SoC shows 25% remaining but the range reads 0 and the car will not run. They have experienced the issue in both an E-Class PHEV and an EQS.
They have finally got a technical answer from Mercedes, after the local garages indicated there was no fault or issue: the granny charger is delivering insufficient amps for the car system to register them during charging. It seems the typical charge rate received with the granny charger is 1.7 to 2kW. Mercedes apparently do not see this as a fault, as the cars are designed to be used with wallboxes.
I would be interested in someone explaining how they think the situation arises. My partner suggests it could happen because the car's circuitry will not receive sufficient power, but those with more knowledge might have other thoughts.
I must admit a PHEV powered by a granny charger seems OK, but to try to run an EQS with a 110 kWh battery seems unrealistic, though the original poster is upgrading.
His view is that Mercedes are at fault and the car should not behave like this. Mine, by the way, is fine on a granny charger the few times I have used one. It is apparently an issue unique to Mercedes.
I also wonder if the low charge rate due to low power at the plug-in site may be an issue, in the way that I only ever get a max of 6.8kW (normally 6.7kW) from my wallbox, clearly due to our variable voltage supply.
Anyway, I am interested in others' thoughts.
 
#2 ·
Interesting one. Depends a lot on the charger design, I think. Sounds as if Mercedes may have a charger design that is more inefficient than most when there is only a limited AC supply. It tends to be a problem with most EV chargers - they aren't that efficient when run at low power. It's not unusual to see charging losses of around 20% to 30% when only a limited AC supply current is available to the charger.

Not sure why this should impact range, but it may be something to do with the way the battery management system (BMS) works. It's quite possible that the BMS isn't good at measuring low DC charge current, so doesn't accurately report the battery state of charge (SoC). That seems quite likely, as 10A AC from the granny lead is only about 5.8A DC into the battery even at 100% efficiency. The EQS charger may well only be around 75% efficient when running at 10A AC input, which means the current into the battery could be down around 4.3A DC.
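For anyone who wants to check that arithmetic, here's a rough sketch (assuming a 230 V supply and a nominal 400 V pack voltage, neither of which I've measured):

```python
# Back-of-envelope check of the figures above. Assumes a 230 V AC supply and a
# nominal 400 V pack voltage -- both assumptions, not measured EQS values.
AC_VOLTS = 230.0
PACK_VOLTS = 400.0

def dc_charge_current(ac_amps: float, charger_efficiency: float) -> float:
    """DC current into the battery for a given AC input and charger efficiency."""
    ac_power = AC_VOLTS * ac_amps             # power drawn at the socket (W)
    dc_power = ac_power * charger_efficiency  # power actually reaching the pack (W)
    return dc_power / PACK_VOLTS

print(f"{dc_charge_current(10, 1.00):.1f} A")  # ~5.8 A at a (theoretical) 100% efficiency
print(f"{dc_charge_current(10, 0.75):.1f} A")  # ~4.3 A at a plausible 75% efficiency
```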

I can see a DC current sensor that has a maximum range of well over 500A (the EQS can charge at up to 207kW) struggling to accurately measure a current of less than 5A.

Not proof, largely guesswork, but then the way the BMS assesses SoC is also largely guesswork. SoC cannot be directly measured, only inferred from voltage, temperature, known charge energy and known discharge energy. If the known charge energy has a big error band, because of the limitations of DC current sensing, then it seems likely that the BMS will under-report SoC, to be on the safe side.
 
#4 · (Edited)
Sounds like bollocks to me.

The on-board charger rectifies and conditions the incoming power through a buck-boost (type) converter and presents a voltage to the HV battery slightly in excess of the battery voltage, thus creating a net current into the battery.

" the granny charger is delivering insufficient amps for the car system to register them during charging " makes no sense because the BMS will test the battery pack voltage and interpret its charge state after the charging current (however small it might be) is removed.

If what is reported is true, then it is purely shit software that is responsible, and whoever programmed it hasn't much of a clue about the practical realities of BEVs.

That being said, if they are doing partial top-ups around the middle of the range (say they topped up from 40% to an actual 70%, but the BMS is being cautious and takes it as 55%), and the BMS is programmed to disregard pack voltage changes unless some charge current threshold has been reached, that would obviously account for it. But that's actually true of any mid-charge top-up at any charge rate: the actual SoC is unclear from battery voltage alone unless and until the pack is fully charged, hence you should top up fully at least occasionally.
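To illustrate roughly how a rest-voltage lookup works, and why it's so vague mid-range, here's a sketch with a completely made-up OCV table (illustrative numbers only, nothing from Mercedes):

```python
import bisect

# Purely illustrative open-circuit-voltage (OCV) to SoC table for a generic
# NMC-type cell -- made-up numbers, not anyone's actual calibration data.
OCV_TABLE = [(3.00, 0), (3.45, 10), (3.55, 20), (3.62, 30), (3.65, 40),
             (3.68, 50), (3.75, 60), (3.85, 70), (3.95, 80), (4.05, 90), (4.20, 100)]

def soc_from_rest_voltage(cell_volts: float) -> float:
    """Linear interpolation of SoC (percent) from a rested cell voltage."""
    volts = [v for v, _ in OCV_TABLE]
    i = bisect.bisect_left(volts, cell_volts)
    if i == 0:
        return 0.0
    if i == len(OCV_TABLE):
        return 100.0
    (v0, s0), (v1, s1) = OCV_TABLE[i - 1], OCV_TABLE[i]
    return s0 + (s1 - s0) * (cell_volts - v0) / (v1 - v0)

print(f"{soc_from_rest_voltage(3.70):.0f}%")  # ~53% with this made-up table
```

Note how flat the middle of that table is: 3.65 V to 3.68 V covers 40% to 50%, so a few millivolts of sensing error moves the estimate a long way. That's exactly why a BMS can't pin down a mid-range SoC from voltage alone.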

In any case, still bad BMS software.
 
#5 ·
...
Sounds like bollocks to me... If what is reported is true, then it is purely shit software that is responsible...
...

I think it's most likely an inherent limitation in Hall effect DC current sensing, TBH. The EQS DC current sensor has a maximum range of well over 500A. When charging from the granny lead that sensor will be trying to measure around 5A. With the best will in the world it's not going to be very accurate when doing that, I think.
 
#8 ·
The few pack teardown videos I've watched all had Hall current sensors. I think the problem with using a shunt is that it's another big source of unwanted heat when rapid charging, or when driving.

A 0.001 ohm shunt is going to dissipate about 250 watts at 500A. At 5A, a 0.001 ohm shunt is only going to give the A to D a voltage of 5mV. The battery compartment and BMS are an electrically noisy environment, with hundreds of amps flowing through the switched mode inverter, as well as the ancillary inverters for things like aircon. All of this noise may appear on the shunt, as there's no galvanic isolation. To get 5% accuracy of current measurement when the battery current is 5A means resolving to 250µV. I doubt this is achievable in that environment, TBH.
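The arithmetic is easy to check (using the 0.001 ohm figure above, which is a plausible round value rather than a known EQS part):

```python
# Quick check of the shunt numbers in the post.
R_SHUNT = 0.001  # ohms (quoted figure, not a known EQS part)

for amps in (500, 5):
    watts = amps ** 2 * R_SHUNT          # heat dissipated in the shunt
    millivolts = amps * R_SHUNT * 1000   # signal presented to the A to D
    print(f"{amps} A: {watts:.3f} W dissipated, {millivolts:.1f} mV signal")

# 5% accuracy at 5 A means resolving 0.25 A:
print(f"{0.25 * R_SHUNT * 1e6:.0f} microvolts")  # 250
```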

Increasing the shunt resistance to get better low end resolution makes the heat dissipation even higher. With a Hall sensor the heat dissipation problem goes away, as does a lot of the interference, as a Hall sensor is inherently galvanically isolated.
 
#9 ·
...the granny charger is delivering insufficient amps for the car system to register them during charging. It seems the typical charge rate received with the granny charger is 1.7 to 2kW. Mercedes apparently do not see this as a fault, as the cars are designed to be used with wallboxes...
Surely if the granny charger is delivering a charge rate too low to be registered by the car, then the battery would actually be receiving additional unrecorded charge, and would therefore give a greater range than predicted?
 
#10 ·
I think the problem may be that the car chooses to under report available range, as the BMS doesn't have an accurate record of the received charge. You're right in that the true range may well be higher, but all the driver can see is the displayed range.

This may well be an issue with other cars as well. The only time I've come close to running the battery flat was a trip to my uncle's funeral and back in February. I started out thinking I'd do a "splash and dash" at a rapid charger about 50 miles from home on the way back, as it was debatable whether I had enough range to get there and back. On the outward trip the remaining range dropped in line with the distance covered, as it did for the first half of the drive home. By the time I was a bit over halfway home the remaining range started to increase. I got to the planned "splash and dash" charger with way more range than I'd expected, so didn't bother to stop. Instead of arriving home with the battery nearly flat, I got home with about 20% to spare. I can only assume that the car/BMS thought the battery had less capacity than it really had. It was freezing cold, too, something that usually hits range pretty hard.
 
#12 ·
It could also be completely unrelated; sometimes BMSes make miscalculations. Bjorn discovered this the hard way with his Model X - he had 8 miles of range left but the car shut down early.

Obviously EV manufacturers make efforts to avoid this, but it can happen on petrol cars too, as fuel gauges can be inaccurate. My dad had an unfortunate experience with a rental Vauxhall while his car was in the accident workshop - it died with half a tank showing, due to a miscalibrated fuel sender.
 
#15 ·
Well, it does charge, but the guessometers are just reading bollocks. That's not exactly 'unknown' in the world of BEVs.
 
#16 · (Edited)
I wonder how LFP batteries cope when they get older, as, if I've understood correctly, the BMS has an even harder time keeping track of things on those.

Our Zoe seems to lose 10% right after a charge, but I believe that's due to old BMS software. There is an update, but I haven't bothered with it since the car is out of warranty; we just charge it to a bit higher SoC. Sometimes it gives the 10% back, too. We're not likely to run it below 20% too often.

Oh, and it will only charge at 1.7kW at 10A (max 2.3). Of that, roughly 1.2kW ends up in the battery, so needless to say 6A is a no-go with this car. With the same granny lead the Megane will charge at 2-2.1kW, and much more efficiently. But the Zoe will charge at 43kW where the Megane will only do 22kW, so probably the Zoe's charger design is just inefficient at low rates, as it gets much better at 3-phase rates.
 
#19 ·
I doubt there's any sort of general issue here. The J1772 charging standard allows currents right down to 6A at 110V, and one would hope that it's been sufficiently tested on most EVs.

Whether this is a general problem with the EQS or just some quirk of this one specific car remains to be seen.
 
#20 ·
It's not an issue with IEC61851 (J1772 doesn't apply in the UK or Europe - it's a US standard). IEC61851 covers the charge point's current advertisement and the way that the charger (in the car) should respond to it. It doesn't define charger efficiency, but leaves that up to the EV manufacturer.

It's a hard fact that chargers are less efficient when running at the IEC61851 minimum of 6A AC input current than they are at a higher AC input current. The difference can be large for some cars: the Renault Zoe charger, for example, has losses of around 30% when charging at 6A AC input, falling to less than 10% at 32A AC input current.
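A crude way to see why is to model the charger as having a fixed overhead (control electronics, coolant pump and so on) that's there regardless of charge rate. Assuming around 400 W of fixed losses - my assumption, picked to fit the Zoe figures, not a published number:

```python
# Crude loss model: assume the charger has a roughly fixed overhead
# regardless of charge rate. The ~400 W figure is an assumption chosen
# to fit the Zoe numbers above, not a published specification.
AC_VOLTS = 230.0
FIXED_OVERHEAD_W = 400.0

def loss_fraction(ac_amps: float) -> float:
    """Fraction of input power lost, if losses are dominated by fixed overhead."""
    return FIXED_OVERHEAD_W / (AC_VOLTS * ac_amps)

print(f"{loss_fraction(6):.0%}")   # ~29% lost at 6 A
print(f"{loss_fraction(32):.0%}")  # ~5% lost at 32 A
```

At 6A the overhead eats close to a third of everything coming in; at 32A it's barely noticeable.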
 
#21 ·
I'm not arguing about efficiency; what you say is of course correct. I'm saying the charging standard allows 6A charging, and the car and charger should be tested and functional at those levels. Thousands of people have charged at 6A without the car getting as broken as the one described in the OP.

Thus it's either something specific to that car, or to EQSs in general - not something that will affect all EVs that happen to be granny charged.
 
#22 ·
Charger efficiency is, in part, key to why this problem exists though. It seems most likely to be related to the dynamic range of the battery DC current sensor, compounded by the approach the manufacturer has taken when designing the BMS guessometer code.

As mentioned earlier, the EQS can charge at around 528A DC or more, and the DC current sensor has to be able to measure at least this much current accurately. When charging from AC at 6A, allowing for the charger inefficiency when running at 6A AC input, the DC current sensor needs to accurately measure less than 2.5A. That is a very big ask for a sensor that probably has a range of well over 600A. Its accuracy will certainly be lower at such a low current, not least because of normal electrical noise from the other electronics within the battery pack case.
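To put rough numbers on that, assuming (purely for illustration, not from any EQS datasheet) a sensor specified at ±0.5% of full scale:

```python
# Illustration only: Hall sensors are often specified as a percentage of
# full scale. The range and accuracy here are assumptions, not EQS data.
FULL_SCALE_A = 600.0     # assumed sensor range
ACCURACY_OF_FS = 0.005   # assumed +/-0.5% of full scale

error_band_a = FULL_SCALE_A * ACCURACY_OF_FS  # +/-3 A
granny_dc_a = 2.5                             # DC charge current from above

print(f"+/-{error_band_a:.0f} A error band on a {granny_dc_a} A signal")
# The worst-case error is bigger than the entire current being measured.
```

With ±3A of error band on a 2.5A signal, the sensor genuinely can't tell whether the pack is charging at all.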

My best guess (and as mentioned earlier it is a guess) is that the BMS guessometer is ignoring, or placing a low weight on, any very low DC charging current measurement, to compensate for the likely measurement inaccuracy. That's then reflected in the guessometer reporting a lower than expected range or SoC. I believe there is a general tendency for guessometers to under-read, much as fuel gauges used to, and it makes some sense to do this, for several valid reasons.
 
#25 ·
I agree; I think it comes down to whatever fail-safes have been coded in with regard to an assumed battery SoC. Measuring battery SoC is more art than science.

There's no direct way to measure it; the best that can be done is to estimate it. You can very roughly estimate SoC from mean cell off-load terminal voltage plus cell temperature, but that has a pretty big error. I tried to do this on a pack years ago and the best I could get was an error of about 20%, which is not really good enough.

Another approach is to combine cell terminal voltage measurement with measurements of energy in versus energy out. This can improve the SoC estimate a lot, but it is very dependent on the accuracy of the current sensor that measures charge and discharge DC current. Any error in that measurement in the case of a very low, long-duration DC charge current could seriously screw up the SoC estimate.
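As an illustration of how badly that can go wrong, here's a minimal coulomb counter with an invented sensor bias. All the figures are assumptions: the capacity is roughly 108 kWh at a nominal 400 V, and the sensor error is made up, not from any real BMS:

```python
# Minimal coulomb counter, showing how a long, low-current charge corrupts
# the SoC estimate. Pack capacity assumed ~108 kWh / 400 V = 270 Ah; the
# sensor bias is invented for illustration.
PACK_CAPACITY_AH = 270.0

def soc_after_charge(start_soc: float, amps: float, hours: float) -> float:
    """SoC (percent) after integrating a constant charge current."""
    return start_soc + 100.0 * amps * hours / PACK_CAPACITY_AH

# 24 hour granny charge: true current 2.5 A, but the sensor reads only 1.0 A.
true_soc = soc_after_charge(30.0, 2.5, 24)  # what the pack actually holds
bms_soc = soc_after_charge(30.0, 1.0, 24)   # what the BMS thinks it holds

print(f"true SoC {true_soc:.0f}%, BMS estimate {bms_soc:.0f}%")
# -> true SoC 52%, BMS estimate 39%: the car under-reports range after
#    nothing more sinister than a slow charge and a biased sensor.
```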
 
#26 ·
A good BMS adjusts its estimate as cell voltages fall. It will principally rely on integrating current to estimate state of charge, but if cell voltages drop below the expected voltage for that state of charge then the estimate needs to be adjusted. While it is true that lithium-ion batteries tend to have quite a flat discharge curve, once state of charge falls below 25-30% the drop in cell voltage is quite noticeable.

The likely cause of failures like dying at 25% is an incorrect coulombic estimate combined with defective software that does not adjust the estimate correctly (possibly not adjusting at the low end at all, instead relying on a periodic 80%+ charge).
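A sketch of that correction logic; the threshold and weights are invented for illustration, not from any production BMS:

```python
def corrected_soc(coulomb_soc: float, voltage_soc: float) -> float:
    """Blend a coulomb-counted SoC with a voltage-derived one.

    Above ~30% the discharge curve is flat, so voltage gets little weight;
    below that it becomes informative and the weight ramps up. Threshold
    and weights are invented for illustration only.
    """
    if voltage_soc >= 30.0:
        weight = 0.05  # flat region: barely trust the voltage reading
    else:
        weight = 0.05 + 0.75 * (30.0 - voltage_soc) / 30.0
    return weight * voltage_soc + (1.0 - weight) * coulomb_soc

# Coulomb counter says 25%, but cell voltage implies only ~10%:
print(f"{corrected_soc(25.0, 10.0):.0f}%")  # -> 17%, pulled down toward reality
```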
 
#27 ·
Maybe something else we don't know is how often the battery is run significantly low and then charged absolutely full. Given that the user is only using a granny charger, they might need to top up every day, even if they've not used many percent, because a "fill" from 10% to 100% would take more than an overnight.

Each manufacturer may have chosen their own approach to the two problems of cell balancing and battery calibration. Hearsay for Kias suggests that cell balancing occurs when the car is on an AC charge and allowed to reach 100% (with the power not cut, as the car's BMS can occasionally want to spend some extra minutes at the end).

Battery calibration is rumoured to require one continuous AC charge from "quite low" to 100%. The "quite low" is variously rumoured to be 10, 15 or 20%. I know that I achieve the <20% to 100% charge about once every two months. As it happens I'm charging on lamp-post chargers which give about 4-5kW.

So I would wonder if a pattern of lots of little top-ups has caused the problem - never doing a significantly-low-to-full charge.
You would have to ask: has that person tried not being so tight and sticking the car on a 7kW public charger to see if that makes a difference? And if that alone doesn't work, do a proper charge from ~15% to 100%, which may help calibrate where "empty" and "full" are and how much charge lies between the two, which can then be turned into a range.

Dare I say: It's a Mercedes FFS! What sort of twonk buys a Mercedes and can't shell out for a public charge @7kW once in a while to see if the problem goes away!

...I must admit a PHEV powered by a granny charger seems OK, but to try to run an EQS with a 110 kWh battery seems unrealistic... His view is that Mercedes are at fault and the car should not behave like this...
My impression is that the granny would probably be fine most of the time. It is what people might be expected to do if away from home, on holiday, whatever. I just suspect the long-term effect of drifting off-calibration.
...I also wonder if the low charge rate due to low power at the plug-in site may be an issue...
I think the current limit is probably not an issue for occasional use, like yours. As I said, most of my "home" charging is not home charging but on lamp posts in our terraced street. I have no issue from regularly charging at under 7kW.
 
#28 ·
...
Dare I say: It's a Mercedes FFS! What sort of twonk buys a Mercedes and can't shell out for a public charge @7kW once in a while to see if the problem goes away!
...
Why should (s)he have to? This is basic stuff that other manufacturers do get right. This sort of nonsense (the problem, not your comment particularly, although it does illustrate that it's deemed OK to have to muck about with things the manufacturer should get right) is what feeds the narrative that EVs are expensive playthings and the serious work is done by fossil fuels. That is patently not true - I cover 30,000 miles a year on long-distance journeys and will never go back to ICE - but, sadly, I now have another manufacturer to add to my avoid list, due to their inability to get the basics right.

For interest, that list reads:

Kia / Hyundai - inability to manage 12V correctly
Stellantis - just generally woeful
and now... Mercedes - inability to manage SoC correctly

Perhaps I'm just grumpy, but to me cars are tools, and I expect them to work properly.