Speak EV - Electric Car Forums

Explain the principle of voltage and current regulation when charging a battery with constant current.

14K views 29 replies 10 participants last post by Glebiys
#1 · (Edited)
Hello,

During my studies, I am preparing a presentation about charging the battery of an EV.

Question: Does the charging station (DC) regulate the output voltage during charging, or does it constantly maintain it at the maximum battery voltage?

From the CHAdeMO specification, I learned that at the beginning of charging, the EV reports the current and the maximum battery voltage, and during the charging process it only reports the required current. But if you watch the charging screen as the SOC rises, you can see a gradual increase in the battery voltage. What raises it?

I have read about the principles of battery charging, but I probably did not fully grasp them, which is why I cannot resolve this question.

For example:
Current battery voltage: 310 V
Fully charged battery voltage: 400 V

I had an idea that the station initially outputs the maximum charging voltage, equal to the fully charged battery voltage (400 V), and that this is maintained throughout the entire charging process. I don't know if I'm right.

On the other hand, if the station always output the maximum charged-battery voltage, why do we see it rise smoothly on the SOC screen?
Does that mean that even if the charging station outputs 400 V, once it is connected to the battery the voltage will still fall to the current battery voltage and then rise smoothly?

Please point me in the right direction ;)

Thank you!
Best regards,
Alex
 
#3 ·
[Image: charging voltage/current curve for a single lithium polymer cell]


This picture shows how the voltage and current of a single lithium polymer cell can vary during charging. Initially the current stays constant until the voltage reaches 4.2 V, then the current falls to hold the charging voltage at 4.2 V. Multiply this by the number of cells to reach your pack voltage and capacity.
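To make that CC/CV curve concrete, here is a minimal toy sketch (my own illustrative model, not any real charger firmware: a linear open-circuit voltage and a fixed internal resistance are assumed):

```python
# Toy CC/CV charge of a single lithium cell. Assumed, illustrative model:
# OCV rises linearly with state of charge; fixed internal resistance.
CAPACITY_AH = 3.0   # assumed cell capacity
I_CC = 1.5          # constant-current phase: 0.5C
V_MAX = 4.2         # constant-voltage limit, per the curve above
R_INT = 0.05        # assumed internal resistance (ohms)
I_CUTOFF = 0.05     # charge deemed complete below this current

def ocv(soc):
    """Open-circuit voltage as a crude linear function of SOC."""
    return 3.0 + 1.2 * soc

def step(soc, dt_h):
    """One time step: pick the current that keeps terminal voltage <= V_MAX."""
    i = I_CC
    if ocv(soc) + i * R_INT > V_MAX:
        # CV phase: current that holds the terminal exactly at V_MAX.
        i = max((V_MAX - ocv(soc)) / R_INT, 0.0)
    return min(soc + i * dt_h / CAPACITY_AH, 1.0), i

soc, t = 0.1, 0.0
while True:
    soc, i = step(soc, 0.01)   # 36-second steps
    t += 0.01
    if i < I_CUTOFF or soc >= 1.0:
        break
print(f"charged to {soc:.3f} SOC in {t:.2f} h, final current {i:.3f} A")
```

The current stays flat at 1.5 A until the terminal voltage hits 4.2 V, then tails off towards zero, which is exactly the shape in the graph.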
 
#4 ·
The chargers I've made over the years are all constant current/constant voltage. The constant voltage is set to the pack's maximum terminal voltage, determined by the number of cells/cell groups in series, and is in effect the limiting shut-off, where the charge terminates. Below that constant voltage, the charger operates in constant current mode, restricting current to whatever is reasonable for the charging C rating of the pack.

For example, my electric motorcycle pack uses 16 cell groups in series, so has a maximum terminal voltage of 16 x 4.15 = 66.4 V. The constant voltage setting in the charger is set to this voltage. The cell groups have a capacity of 60 Ah, and can charge at up to 5C, but I limit the charge current to 0.5C, just to keep the size and weight of the charger reasonable. The charger has its constant current set to 30 A. When first turned on, the battery pack voltage will typically be under 60 V, below the constant voltage setting, so the charger will run in constant current mode and deliver a steady 30 A to the battery pack. As the battery pack reaches the constant voltage setting, the current starts to decrease, until at 66.4 V the current reduces to close to zero, as the pack is fully charged.
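As a sanity check, those charger settings fall straight out of the pack numbers quoted above (all values copied from the post; nothing else assumed):

```python
# Charger CC/CV settings derived from the pack described above.
cells_in_series = 16
v_cell_max = 4.15      # per-cell charge termination voltage
capacity_ah = 60
c_rate_limit = 0.5     # charger sized for 0.5C, well under the cells' 5C limit

v_pack_max = cells_in_series * v_cell_max   # the charger's CV setting
i_charge = capacity_ah * c_rate_limit       # the charger's CC setting

print(f"CV setting: {v_pack_max:.1f} V")    # 66.4 V
print(f"CC setting: {i_charge:.0f} A")      # 30 A
```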

There's a bit more to it than that, as the BMS signals to the charger to reduce the charge current as soon as the first cell group reaches 4.15 V, because the BMS balance shunts can only handle a limited charge current. If the pack is perfectly balanced then all the cell groups will trigger the BMS reduced current request at the same time.
 
#6 ·
In most cases CHAdeMO works at constant current initially, as the charge point cannot supply the maximum power the battery could accept. However, for smaller or degraded batteries, or later in the charge process, the voltage limit becomes the restriction, as in @Ralkbirdy's single-cell example.
 
#7 · (Edited)
@farmergiles, @Ralkbirdy, @Jeremy Harris, @freddym, @dk6780,

Thank you so much for the answers!


If I understand you correctly:

During the charging process, the station only regulates the current.
The output voltage limit is set only once, at the beginning of charging; it equals the maximum (target) battery voltage that the EV reports, or a lower value depending on the condition of the battery.
But after connecting the battery as the load, we will not see this value, since the voltage is pulled down to the battery voltage.
The smooth increase in voltage that we see on the charging station's screen and in the charge curve is the battery charging itself; no one controls it directly, it is the chemistry of the battery.

At the end of charging, when the voltage is almost at maximum, the current is limited so that the BMS does not have to dissipate too much energy.


[Image: charging diagram with voltmeter between station and battery]


UPD.
The voltmeter will likely show the average of the charging voltage and the current battery voltage.
For example, in our case: (400 + 320) / 2 = 360 V.
 
#8 ·
No, this is not how real world EV charging works.

Don't confuse real EVs with someone's DIY efforts or internet descriptions of single cells in non-automotive traction conditions.

The onboard BMS is totally in control of the entire process; it will limit and/or control current AND voltage depending on the cell temperatures, cooling-system temperatures, cell voltage, state of charge, etc.

Most of what you have surmised is speculation at best; realistically, it's pure fiction.
 
#9 ·
For the record, the BMS has zero inherent current-control capability in any EV. All actual control of charge current is done in the charger, which just responds to data from the BMS. It's a key distinction: all the BMS does is measure parameters within the pack (charge current, temperature, cell-group terminal voltage, etc.) and pass that data to the charger so that it can actually control the charge current. It's easy to see this if you strip a pack and look at the BMS: there's no high-power control circuitry in there at all.
 
#14 ·
Thanks for the answers! I am reading articles, mainly from Battery University.

I'm still a student, so I don't know everything yet.

The only question is: did I understand correctly that during the entire charging period, the station outputs a maximum voltage equal to that of a fully charged battery (the target voltage obtained from the EV before charging)? On the voltmeter, we see the average of the station voltage and the battery voltage (current flows as the system tries to balance the voltages).
 
#16 ·
On the voltmeter, we see the average of the station voltage and the battery voltage (current flows as the system tries to balance the voltages).
It's the same voltage. I'm pretty sure the DC power pins are connected directly to the battery while charging (through a contactor, for safety reasons). The charger doesn't care much what the voltage is, nor does it try to define it; it just supplies the amount of current that the BMS tells it to, and the BMS adjusts that current demand to steer a safe course between charging quickly and avoiding battery damage.

There will be a very small voltage drop between one end of the cable and the other because of cable resistance, but we're not trying to measure or control that.
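A rough sketch of that division of labour, with made-up thresholds and a simple linear taper (real BMS logic is far more involved and depends on temperature maps, SOC, and so on):

```python
# Hypothetical split: the BMS turns its measurements into a current demand;
# the charger just delivers that demand, clamped to its own capability.
# All limits and the taper shape here are illustrative assumptions.

def bms_current_demand(max_cell_v, pack_temp_c,
                       v_cell_limit=4.15, i_max=125.0):
    """Charge current the BMS asks for (all thresholds are assumptions)."""
    if pack_temp_c > 45.0:            # too hot: back off hard
        return i_max * 0.25
    headroom = v_cell_limit - max_cell_v
    if headroom <= 0:
        return 0.0                    # a cell hit the limit: stop
    # Taper linearly over the last 0.10 V of headroom.
    return i_max * min(headroom / 0.10, 1.0)

def charger_output(demand_a, charger_max_a=100.0):
    """Charger side: meet the demand, up to its own maximum."""
    return min(demand_a, charger_max_a)

print(round(charger_output(bms_current_demand(3.90, 25.0)), 1))  # 100.0 (charger-limited)
print(round(charger_output(bms_current_demand(4.12, 25.0)), 1))  # 37.5 (tapering)
```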
 
#17 ·
The voltmeter in your diagram would show 320 V, and it would then start rising as the battery filled up.

If you imagine the battery as a resistor, you can see that for a given resistance, and a given current you'll get a specific voltage.

In CCCV charging, the charger will set a maximum voltage, and a maximum current. It will then ramp the voltage until either the current limit is reached, or the voltage limit is reached.

With a flat battery, it'll initially operate against the current limit. The voltage will be only slightly above the open-circuit voltage of the battery. As charging progresses, the voltage of the battery will climb. Eventually, once the battery has reached perhaps 80 or 90% SOC, the voltage will peak at the pack limit and the current will start dropping off. Through this whole process, imagine the internal resistance of the pack increasing: it's less and less willing to accept charge, and because the voltage has stopped rising, the current drops off instead.

Lithium batteries are deemed full when the current reaches a certain lower limit.
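The "ramp until one limit binds" behaviour above can be sketched by treating the battery as an ideal source behind an internal resistance (all numbers are illustrative):

```python
# Which limit binds: the charger's current limit (CC) or voltage limit (CV)?
# Battery modelled as an open-circuit voltage behind an internal resistance.

def charger_operating_point(ocv, r_int, v_limit=400.0, i_limit=125.0):
    """Return (terminal voltage, current, mode) for the assumed model."""
    v_at_i_limit = ocv + i_limit * r_int   # voltage if we push full current
    if v_at_i_limit <= v_limit:
        return v_at_i_limit, i_limit, "CC"   # current limit binds
    i = max((v_limit - ocv) / r_int, 0.0)    # hold v_limit, current falls
    return v_limit, i, "CV"

print(charger_operating_point(ocv=320.0, r_int=0.25))  # (351.25, 125.0, 'CC')
print(charger_operating_point(ocv=396.0, r_int=0.25))  # (400.0, 16.0, 'CV')
```

With a flat pack the current limit binds; near full the voltage limit binds and the current tails off, matching the description above.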
 
#18 ·
Lithium batteries are deemed full when the current reaches a certain lower limit.
While that will happen in the circuit you described, it's better to think in terms of a maximum safe voltage per cell that must not be exceeded.
Also it gets complicated with many cells in series, because their voltages differ slightly, and when just one of them reaches that limit you can't go on charging, so in the real world your circuit would be a dangerous oversimplification unless the battery was a single cell. For EVs the BMS must measure every single cell voltage.
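A quick illustration of why per-cell measurement matters (cell voltages are made up):

```python
# Pack-level voltage alone can look fine while one cell is already at the limit.
cell_voltages = [4.08, 4.11, 4.15, 4.09]   # one cell at the limit
V_CELL_LIMIT = 4.15

pack_v = sum(cell_voltages)        # 16.43 V, below 4 * 4.15 = 16.60 V
must_stop = max(cell_voltages) >= V_CELL_LIMIT

# A pack-level check would happily keep charging; the per-cell check stops.
print(must_stop)   # True
```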
 
#22 · (Edited)
Based on the above discussion you're going to end up with a significant set of confusions that are going to be problematic to unpick.

In any case, you seem to be asking how the controller decides what voltage and current to generate. The answer is: it does NOT.

The voltage and current of a power source are defined not by the source alone but by the load.

I'll explain, but you need to know at least something about electronics; if this still needs more explanation, don't try to reinterpret it yourself.

Let's say you have a 10V battery that can put out 100A. You connect this battery across a 1 ohm resistor. Do you get 100A flowing? Answer: no. What you have is 10V across 1 ohm, so 10V / 1 ohm = 10A will flow.

Do you see that the 'source' has the following criteria: max voltage 10V, max current 100A, AND a matched load impedance (impedance is a fancy word for resistance) of 10V / 100A = 0.1 ohm.

The only way you can get BOTH the 10V rating AND the 100A rating out of the battery is by putting a 0.1 ohm resistor across it, i.e. you are MATCHING load to source.

Now what happens when you put a 10 milliohm resistor across that 10V battery? Well, it can only deliver 100A, and 100A through 10 milliohms equates to 1V, so the battery voltage is pulled down to 1V.

In this case, the voltage is referred to as having 'folded', because it cannot keep up its full 10V, it just can't do it.

(For the purposes of this explanation, our theoretical battery has zero internal resistance, but you also need to look up 'internal resistance' which applies to all circuits and batteries.)
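That worked example can be checked in a few lines (an ideal 10 V source with a 100 A limit, as above; zero internal resistance assumed, per the parenthetical):

```python
# Ideal 10 V source limited to 100 A, driving various load resistances.
# The source holds 10 V unless the load would draw over 100 A, at which
# point the voltage 'folds' so that exactly 100 A flows.

def operating_point(r_load, v_src=10.0, i_max=100.0):
    i = v_src / r_load
    if i <= i_max:
        return v_src, i              # behaves as a voltage source
    return i_max * r_load, i_max     # current-limited: voltage folds

print(operating_point(1.0))    # (10.0, 10.0)  -> 10 A flows
print(operating_point(0.1))    # (10.0, 100.0) -> matched load
print(operating_point(0.01))   # (1.0, 100.0)  -> voltage folds to 1 V
```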

When charging an EV battery (in fact, any battery), the battery is like the resistor: it presents an impedance. When the state of charge is low, its resistance is lower. When you attach a battery charger, the charger can present a range of impedances (that is, it can vary its voltage-to-current ratio). If it had a FIXED impedance, it could only charge the battery at that particular voltage/current ratio, which wouldn't be much use.

Now, it is the BMS that controls the charging of the battery in an EV. The BMS is the fundamental controller, but there are also other controllers in the system, each able to limit the rate of charge. Multiple controllers are like a set of tax administrators, each stamping the cheque you are getting and each having the right to reduce its value: you only get the minimum that any one of them thinks you deserve. Same with all the controllers; there's one in the car, one in the charger, and maybe others, in the lead and so on.

Real world example; you have a hot 20% SOC battery with an internal impedance of 1.8 ohms, you plug in and the BMS on your battery sends a signal to the charger that your battery can only accept 200A due to its temperature.
The charger responds using a 'switch-mode power supply', which sends packets of charge into a capacitor connected directly to your battery, pumping up its voltage until 200A flows between it and your battery. There is no control of the voltage.
As your battery is 1.8ohm, at its current temp and SOC, we will find that the voltage is therefore 200A * 1.8 ohm = 360V.
You see that no one is setting any voltage. The charger could actually put out 500V, but never does, because the BMS will say 'whoa, I am now reading too much voltage on my pack and I hereby demand ZERO current', and the charger stops.
The charger would, I am sure, also shut off if the voltage suddenly surged or dropped, i.e. there would be a dV/dt threshold.

Let's say your battery cools down a bit and its impedance drops to 1.5 ohms. Your BMS now says 'I can now run to 250A'; the charger responds, pumping more little packets of charge into the connection to your battery until the current is 250A. Now the voltage is 250A * 1.5 ohm = 375V.

At 80% the battery impedance has steadily increased from 1.5 ohms to 1.6ohm, at which point the voltage is now 400V (still 250A, but x 1.6 ohms = 400V now).

Do you see? It is NOT the current demand that changes, that has been constant. It is NOT the voltage demand, there isn't one, it only has to be within a certain range. What changes is the impedance of the battery as it changes SOC and temperature.

The BMS on the car may be programmed to ramp off the current now, so it sends a signal to the controller for less current. Let's say at 90% the impedance has increased to 3 ohms (it is a sharp increase above 80%), and the BMS is then demanding 133A, which keeps the charge voltage at about 400V (133A x 3 ohm = 399V). Again, there is no regulation BY the charger other than to create the demanded current and stay 'somewhere' within some voltage range.
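Computing the stages of this example directly (same numbers as above; note 133 A x 3 ohm is 399 V, i.e. roughly the 400 V quoted):

```python
# Each stage: the charger meets the BMS current demand, and the terminal
# voltage is simply I * R for the pack's impedance at that moment
# (treating the pack as a pure resistance, as in the example above).
stages = [
    # (BMS current demand in A, pack impedance in ohms, note)
    (200, 1.8, "hot, 20% SOC"),
    (250, 1.5, "cooled down"),
    (250, 1.6, "80% SOC"),
    (133, 3.0, "90% SOC, BMS tapering"),
]
for i_demand, r_pack, note in stages:
    print(f"{note:22s} {i_demand:3d} A x {r_pack} ohm = {i_demand * r_pack:.0f} V")
```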

It is the BMS that regulates the voltage, which it does by setting the current demand instruction to the charger.

If the charger goes mad and breaks and decides it is going to try to stuff 300A into a 1.5 ohm battery, it will generate 450V that may cause all sorts of damage. At this point (before this point) the car BMS will spot this and disconnect the relay between the charging terminals on its car and the charger. Likewise, if there is a sudden short circuit in the car and the current demand on the charger shoots up, the charger will shut down to protect itself and the car.

HTH.

Look up 'switch mode power supplies', 'battery internal impedance', 'matching DC loads and sources'

good luck.
 
#24 ·
The flying capacitor system is a brilliant idea in theory, but it would involve some interesting switching circuitry in an EV battery with hundreds of cells at potentials spanning 400V.
Maybe the batteries you've built with flying capacitor balancing worked at lower voltages. (and didn't have so many cells)
 
#25 ·
It's pretty much infinitely scalable, so it shouldn't be a problem with any size of pack; just keep adding modules for however many cell groups you have. The design I'm using has optoisolation for the FET switching signals, so there are no issues relating to the control circuitry supply, and the total pack voltage is only limited by the isolation voltage of the optoisolators.
 
#26 ·
It's my understanding that EVs balance all the time, not just when charging. That's the point I'm making. Their balancing systems are tiny; they have no way to dissipate huge amounts of power. But they don't need to, because they don't only balance when charging. Instead, they're always working to keep things in line.
 
#27 ·
There is no one agreed method, each BMS will vary its design.

The Ampera battery always balanced whenever there was current flowing in or out of a cell, charging or in use. The early Leaf did top balancing only, AFAIR. Most early prototypes did bottom balancing, because there are clear advantages, but you have to run the battery flat to do that, so it's no use for consumer BEVs, as most users don't want to do that with their EVs! It can work for PHEVs, though.

Cells are now so well matched in manufacture that, ironically, more often than not it is the cell-balancing circuits themselves that cause the cells to become imbalanced!