Solar battery and V2L

For a system without closed loop communications that's correct. However, charge controllers and inverters should have charge and discharge voltage limits set within the safety limits the BMS uses for shut-off, and those charge controller and inverter limits should not allow individual cells to exceed their upper and lower voltage limits unless something goes wrong or the pack is very badly out of balance.
That really isn't the case. A single pack can be out of balance by a small amount and the inverter would not do anything about it; the inverter usually only sees the overall terminal voltage of the whole pack. If 3 of the cells were 50mV down on the others, the inverter could well exceed individual cell voltage limits.

Using a BMS to regularly cut power is not a job they are designed for.
This is EXACTLY what they're designed for.

They are designed to do it as a safety backstop when charge controllers / inverters have not done their job, or some other fault occurs. The BMS should be a last line of defence, not the first.
This is not a line of defence, it's control.

Ah, OK, I see what you are saying.

I was not considering that, given the OP was referring to using a plug-in battery charger without the ability to set a charge voltage. IOW it was all or nothing.
I took it that the OP was wanting additional support for his house system, which is what I do with mine.

I guess if that is the application, then using a basic 15S charger on a 16S battery will work to provide a supplemental energy supply when the home battery drops to a lower SOC.
I would use an SMPSU rather than a charger for this, which simply gives a straight voltage out at up to the maximum wattage supplied by the V2L. By keeping the voltage below the maximum you don't have the complications of multi-stage charging. On mine, the output is adjusted so that energy is taken from the SMPSU at around 90% SOC. It's absolutely seamless and a relatively smooth transfer: it starts at around 92% and is at full power by 90%.
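As a rough illustration of choosing that SMPSU setpoint - the cell voltages below are assumptions for a generic 16S LFP pack, not settings taken from any particular system:

```python
# Rough sanity check for picking a fixed SMPSU output voltage for a 16S LFP
# pack, per the approach described above. All cell voltages here are
# illustrative assumptions, not anyone's actual settings.

CELLS_IN_SERIES = 16
CELL_FULL_V = 3.45        # assumed "full enough" per-cell voltage for LFP
CELL_REST_90PCT_V = 3.33  # assumed approximate resting voltage near 90% SOC

pack_full_v = CELLS_IN_SERIES * CELL_FULL_V
smpsu_setpoint_v = CELLS_IN_SERIES * CELL_REST_90PCT_V

print(f"Pack 'full' voltage:      {pack_full_v:.1f} V")      # ~55.2 V
print(f"Suggested SMPSU setpoint: {smpsu_setpoint_v:.1f} V")  # ~53.3 V

# With the setpoint below the pack's full voltage, the SMPSU only starts
# supplying current once the home battery sags to roughly that SOC, giving
# the seamless hand-over described above.
```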
It's not an approach I would take, however, as I would want to recharge the home battery more than that; the car may not always be available, so charge the home battery up when you can.
All easily achieved.
 
Oh dear - there's quite a bit of misinformation in this thread so far, IMHO :( With regard to V2L and the earthing points - please see these postings I've made previously about that - important - matter:-

My thoughts about earthing in MG4 Discharge Cable thread

My thoughts about powering a boiler in MG4 powering other stuff (Vehicle to Load aka ‘V2L’) thread

And some feedback on other comments...
From a consumer in the UK, it might be similar; the standard is TN-C-S, or at least that's what I've just been given by my DNO
TN-C-S is the most common earthing arrangement in the UK, unless you are rural with overhead cables, in which case TT is more likely.

which begs the question should I add the old one into the mix, given that the MG4 doesn't have one
See my other posting (link above). The issue with TN-C-S is that under failure conditions the DNO supplied earth may not be present.

Can I then take V2L power from car and just plug that into a mains socket to charge the house batteries?
Absolutely NO. That will destroy your MG's inverter immediately. @Coulomb has already said that, I know, but it is so important I wanted to state it again. I note @johnb80 said you can, but I suspect he misunderstood your point - or maybe @Coulomb and I have both misunderstood what you meant, but for the sake of anyone else reading this thread.... NEVER connect your V2L to your house mains.

Can they also be set to take V2L power back?
No - they are not bi-directional chargers.

If you mean can you use the Ginlong inverter (I'm not familiar with them)
I suspect you have ;) Ginlong Solis is the full name of Solis inverters - the 3rd largest hybrid inverter manufacturer in the world.

connecting its AC-in port to the V2L output, and the house battery to the battery port, and the voltages are compatible (e.g. 48 V inverter and 48 V battery), and everything is configured correctly, then yes.
I'd say no to doing that. The Solis AC-coupled ESS inverter (such as the RAI model or similar) is a grid-tied, not off-grid, inverter. You should never connect the V2L or any generator input to its AC-grid port.

Using a BMS to regularly cut power is not a job they are designed for. They are designed to do it as a safety backstop when charge controllers / inverters have not done their job, or some other fault occurs. The BMS should be a last line of defence, not the first.
(y) 100% agree. The BMS acts as a monitor device that can balance cells and will act as a safety valve if everything else goes wrong.

That really isn't the case. A single pack can be out of balance by a small amount and the inverter would not do anything about it; the inverter usually only sees the overall terminal voltage of the whole pack. If 3 of the cells were 50mV down on the others, the inverter could well exceed individual cell voltage limits.
Not in a closed loop system such as the one @wattmatters mentioned. For the situation of cell over-voltage, what should happen on a correctly configured system is:-
a) on reaching the max desired cell voltage (typically set to between 3500mV and 3600mV (for LFP's)) the BMS should signal a 100% SOC to the inverter, which in turn should stop its charge process.
b) in the case of a catastrophic failure where the inverter continues to charge the battery pack(s), the BMS will then activate its safety mechanisms should a cell reach the cell overvoltage protection value - typically this will be 3650mV. At that point the BMS will raise an alarm to the inverter and cut off the battery charge circuit.
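A minimal sketch of that two-tier behaviour (the thresholds are illustrative, and the two helper functions are hypothetical placeholders, not any real BMS or inverter API):

```python
# Minimal sketch of the two-tier behaviour described in (a) and (b) above for
# a closed-loop LFP system. Thresholds are illustrative; send_soc_to_inverter()
# and open_charge_fets() are hypothetical placeholders, not a real BMS API.

TARGET_CELL_MV = 3550    # (a) point at which the BMS signals "full" to the inverter
PROTECT_CELL_MV = 3650   # (b) hard protection cut-off inside the BMS itself

def send_soc_to_inverter(soc_percent: int) -> None:
    print(f"CAN: report SOC {soc_percent}% - inverter tapers/stops charging")

def open_charge_fets() -> None:
    print("PROTECTION: charge FETs opened, alarm raised")

def on_cell_voltages(cell_mv: list[int]) -> None:
    highest = max(cell_mv)
    if highest >= PROTECT_CELL_MV:
        open_charge_fets()            # (b) last line of defence
    elif highest >= TARGET_CELL_MV:
        send_soc_to_inverter(100)     # (a) normal closed-loop path

on_cell_voltages([3412, 3555, 3540, 3498])   # triggers (a)
on_cell_voltages([3412, 3655, 3540, 3498])   # triggers (b)
```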

I'm struggling here. Where would I plug it in?
The charger is something that would need to be wired in, with suitable fuses (as @wattmatters detailed above). And taking into consideration maximum charge current that your battery supports, as I touched on in post #8 in this thread.

Please share with us how it's asking for trouble?
Using a BMS to regularly cut power is not a job they are designed for.
This is EXACTLY what they're designed for.
I'm with @wattmatters. Sure, a BMS is designed to be able to cut off charge or discharge current when certain parameters are exceeded (such as cell over/under voltage, over/under current, over/under temperature). But doing that repeatedly for a job they are not designed for will reduce the lifespan of the charge / discharge MOSFETs. AFAICR this is due to heat build up within the semiconductor when shut down during large current draw. And MOSFETs have an unfortunate tendency to fail in a closed, rather than open, circuit mode, IME.
 
I own a Victron based system and in theory this is easily done via the main Victron charger/inverter and the use of an additional 48V Victron charger connected to V2L and then linked to the second set of +/- poles. Both chargers are monitored and linked to the battery BMS via CAN. That way you can be sure that V2L is only used when the BMS deems it necessary to top up the home battery. Handy I suppose where you can still find free charging at supermarkets and then power your house at night - I haven't bothered so far as I cannot justify the extra cost :) ...plus coming close to that time of year when a 15.1kWh home battery will power the entire house for 2-3 days in the case of a power cut.
 
Thanks, yes - Ginlong Solis.

I will not do anything until the inverter provider comments, and frankly I think I'll just use V2L as a supply if the mains is down, to power a plug not on the house circuit.
 
Absolutely NO. That will destroy your MG's inverter immediately. @Coulomb has already said that, I know, but it is so important I wanted to state it again. I note @johnb80 said you can, but I suspect he misunderstood your point - or maybe @Coulomb and I have both misunderstood what you meant, but for the sake of anyone else reading this thread.... NEVER connect your V2L to your house mains.
Absolutely agree you must not connect V2L to your house mains supply. I read the OP's post that he had an inverter battery charger and that he was asking if he could connect the grid input of the inverter to the V2L and use the battery charger function to charge his batteries which is a yes.

Not in a closed loop system that @wattmatters mentioned. For the situation of cell over-voltage, what should happen on a correctly configured system is:-
a) on reaching the max desired cell voltage (typically set to between 3500mV and 3600mV (for LFP's)) the BMS should signal a 100% SOC to the inverter, which in turn should stop its charge process.
That's not what happens on any of the BMSs / inverters that I have used. They have parameters for each cell and for the battery overall for over voltage / undervoltage, over current, over temp, under temp etc. In the event of an issue it will switch off the charge and / or discharge functions. It will also communicate via CAN bus to the Victron Inverter. The Victron however doesn't have any individual cell data to react upon. It does get overall SOC, voltage, temperature and current. If a single cell exceeds its parameters the charge/discharge circuits are opened. I have run 3 different BMSs on this system and they all did the same: Seplos V2, JK and now Seplos V3.

b) in the case of a catastrophic failure where the inverter continues to charge the battery pack(s), the BMS will then activate its safety mechanisms should a cell reach the cell overvoltage protection value - typically this will be 3650mV. At that point the BMS will raise an alarm to the inverter and cut off the battery charge circuit.
It will certainly cut off the charge supply; in my case it doesn't raise any alarms, the inverter only sees 0 amps flowing.

The charger is something that would need to be wired in, with suitable fuses (as @wattmatters detailed above). And taking into consideration maximum charge current that your battery supports, as I touched on in post #8 in this thread.
You're unlikely with a 2.4 kW V2L supply to exceed the battery charge rate.

I'm with @wattmatters. Sure, a BMS is designed to be able to cut off charge or discharge current when certain parameters are exceeded (such as cell over/under voltage, over/under current, over/under temperature). But doing that repeatedly for a job they are not designed for will reduce the lifespan of the charge / discharge MOSFETs.
That's not so. Power MOSFETs have almost 0 ohms when switched on and infinite ohms when switched off, therefore the heat build up is zero. The time they do build up heat is when they're switching between on and off, but the rapid switching that they do minimises any heat build up.

Here's the MOSFETs and heatsink on a Seplos 200A V3 BMS; the heatsink is completely passive and even on full load doesn't exceed ambient by more than a couple of degrees.

[Images: the MOSFETs and heatsink on the Seplos 200A V3 BMS]


AFAICR this is due to heat build up within the semiconductor when shut down during large current draw. And MOSFETs have an unfortunate tendency to fail in a closed, rather than open, circuit mode, IME.
Power MOSFETs can fail either open or closed. If they fail closed in this sort of application they end up being burned open, because all of the others sharing the load in parallel will open, putting the full load through the failed device and resulting in a very clear indication of which one has failed - it acts just like a fuse.

Think about MOSFETs used in other fields, speed control of our EVs for example. They switch thousands of times per second, not once per charge, and they go on for thousands of miles. The MOSFETs are not being stressed in this relatively easy application. If the system worked as you think it does, how do you think the inverter reduces the charge current? It does it by changing the PWM signal to a power MOSFET, thereby reducing current - so why is this not 'stressing' the MOSFETs?

Interesting thread though, and always good to hear alternative views / beliefs.
 
That really isn't the case. A single pack can be out of balance by a small amount and the inverter would not do anything about it; the inverter usually only sees the overall terminal voltage of the whole pack. If 3 of the cells were 50mV down on the others, the inverter could well exceed individual cell voltage limits.
The BMS's on-board balancer (if it has one) looks after that job and does so while the battery voltages are within the upper voltage safety threshold. Usually something you don't want activated until cell voltages exceed ~3.45 V.
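A sketch of that balancing decision, assuming a passive (bleed-resistor) balancer; the 3.45 V start threshold and the 30 mV window are illustrative values only:

```python
# Sketch of a top-balancing decision, assuming a passive (bleed-resistor)
# balancer. The 3.45 V start threshold and 30 mV window are illustrative,
# not any particular BMS's defaults.

BALANCE_START_V = 3.45    # don't balance until at least one cell is this high
BALANCE_WINDOW_V = 0.030  # bleed cells more than this above the lowest cell

def cells_to_bleed(cell_v: list[float]) -> list[int]:
    if max(cell_v) < BALANCE_START_V:
        return []                       # still in the flat part of the LFP curve
    lowest = min(cell_v)
    return [i for i, v in enumerate(cell_v) if v - lowest > BALANCE_WINDOW_V]

print(cells_to_bleed([3.32, 3.33, 3.34, 3.33]))   # [] - too low to balance yet
print(cells_to_bleed([3.40, 3.46, 3.52, 3.41]))   # [1, 2] - bleed the high cells
```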

This is EXACTLY what they're designed for.
I beg to differ. I have seen a BMS blow a FET / let out the magic smoke when doing a switch-off at high current.

They are a safety backstop for charge/discharge under/over voltage (cell and pack voltage), under/over current, under/over temperature.

This is not a line of defence, it's control.
A BMS may certainly communicate with charge controllers and inverters in a closed loop system to provide the battery data and/or instructions on what the BMS wants, e.g. it's time to stop discharging/charging or to change the charging mode.

That's not using the BMS to do the actual switching on/off of charging and discharging, that's using the BMS data to have charge controllers and inverters do the switching job.

IMO it's poor system design if the BMS is being regularly called upon to do the on/off switching of charging or discharging.
 
The BMS's on-board balancer (if it has one) looks after that job and does so while the battery voltages are within the upper voltage safety threshold. Usually something you don't want activated until cell voltages exceed ~3.45 V.
Agreed, most of the decent BMSs now incorporate active balancing as opposed to the passive balancing of earlier and/or poorer quality units. It makes a massive difference.

I beg to differ. I have seen a BMS blow a FET / let out the magic smoke when doing a switch-off at high current.
Probably for the reasons I outlined above, i.e. a single MOSFET failed short circuit, the others switched off, causing all of the load to pass through the MOSFET that was short circuit, causing loss of smoke etc. This doesn't mean switching off at high current caused the failure. Furthermore, as stated earlier, MOSFETs are designed to switch and do so at high current and high frequency. I have seen MOSFETs work for years under high voltage, high current and high frequency applications with absolute reliability, so IMHO your reasoning is wrong.

They are a safety backstop for charge/discharge under/over voltage (cell and pack voltage), under/over current, under/over temperature.
I could agree with you and then we'd both be wrong ;)

A BMS may certainly communicate with charge controllers and inverters in a closed loop system to provide the battery data and/or instructions on what the BMS wants, e.g. it's time to stop discharging/charging or to change the charging mode.
Of course, I have never suggested otherwise.

That's not using the BMS to do the actual switching on/off of charging and discharging, that's using the BMS data to have charge controllers and inverters do the switching job.
Agreed BUT if a couple of cells hit the limits the BMS will be opening the charge / discharge circuits to stop operation - exactly what it's supposed to do. It will not message the Victron Inverter to say cells 4 and 5 are at maximum, please stop charging; it simply opens the charge circuit.

IMO it's poor system design if the BMS is being regularly called upon to do the on/off switching of charging or discharging.
Duly noted. IMO it's a very poor system design / overview / understanding when it's claimed MOSFETs shouldn't switch under load due to it reducing their lifespan.

I suspect we're never going to agree on this particular topic so I will now cease any further responses. I have enjoyed debating this with you all but now need to go and check the smoke levels in my MOSFETs.

J
 
That's not what happens on any of the BMSs / inverters that I have used.
Then it would appear that none of the systems you have used are operating correctly.

They have parameters for each cell and for the battery overall for over voltage / undervoltage, over current, over temp, under temp etc. In the event of an issue it will switch off the charge and / or discharge functions.
That is correct, but unless something has gone wrong you should not be having issues. As both @wattmatters and I have repeated, forced disconnection of charge or discharge current via the MOSFETs is a final safety measure that, in normal operation, should never be triggered. Think of that action as being like a safety belt or airbag in a car. Yes, it will do its job and protect your LiFePO4s, but the situation that triggers it should ideally never happen.

It will also communicate via CAN bus to the Victron Inverter. The Victron however doesn't have any individual cell data to react upon.
Inverters don't generally receive or act upon that detail and I never said they did. What I said was "on a correctly configured system.... then on reaching the max desired cell voltage (typically set to between 3500mV and 3600mV (for LFP's)) the BMS should signal a 100% SOC to the inverter, which in turn should stop its charge process."

If a single cell exceeds its parameters the charge/discharge circuits are opened.
Which parameters? A BMS will typically have multiple parameters. From a mixture of coulomb counting, individual cell voltage and pack voltage, the BMS will send 'control' information to the inverter to reduce and then stop charging once the cells are fully charged (according to the voltages set). As I mentioned above, this might be set to 3550mV max per cell. Only if something has gone terribly wrong and the voltage of a cell reaches the 'overvoltage' protection value (typically set at 3650mV for a LiFePO4 cell) should the BMS cut charge current. BUT that situation should never be reached. If your BMS is cutting out due to a 'protection' event then something (probably the protection values) within your BMS is not set correctly. My BMS and system have been running for over two and a half years and I have never had a protection event raised.

It will certainly cut off the charge supply; in my case it doesn't raise any alarms, the inverter only sees 0 amps flowing.
It is odd that your BMS is not raising alarms to your Victron - Protection & Alarm flags should be present in CAN bus message ID 0x359 (for vanilla SMA or Pylontech protocols) - maybe Victron uses a different message ID that is incompatible with your Seplos BMS?
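For anyone wanting to check what their BMS is actually putting on the bus, something like this (using the python-can library) will dump the raw bytes of the 0x359 frame. The interface/channel names and the byte positions are assumptions for illustration - check them against your BMS's own protocol document:

```python
# Sketch of inspecting the protection/alarm bytes in CAN frame 0x359 with the
# python-can library. The interface/channel and the byte positions below are
# assumptions for illustration only - verify against your BMS's protocol doc.

import can

bus = can.interface.Bus(interface="socketcan", channel="can0")  # assumed setup

try:
    while True:
        msg = bus.recv(timeout=5.0)
        if msg is None:
            print("No frames seen - check wiring, bitrate and termination")
            break
        if msg.arbitration_id == 0x359:
            protection = msg.data[0:2]  # protection flags (layout varies by protocol)
            alarm = msg.data[2:4]       # alarm/warning flags
            print(f"0x359 protection={protection.hex()} alarm={alarm.hex()}")
            break
finally:
    bus.shutdown()
```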

You're unlikely with a 2.4 kW V2L supply to exceed the battery charge rate.
On its own, obviously not. My point (a) in post #8 was with regard to adding an additional charger to the battery and not exceeding the total current being supplied to the battery pack. For example, my Solis will deliver a full 100A to my battery pack, which is the maximum supported charge current of the BMS. So without restricting the inverter charge rate I cannot simply add an extra 2.4kW (I thought it was 2.2kW) from the V2L.
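A quick back-of-envelope version of that current budget (the pack voltage and charger efficiency are assumed figures, purely for illustration):

```python
# Back-of-envelope current budget for adding a V2L-fed charger alongside an
# inverter that can already deliver the BMS's maximum charge current.
# Pack voltage and charger efficiency are illustrative assumptions.

BMS_MAX_CHARGE_A = 100     # maximum charge current the BMS supports
INVERTER_MAX_A = 100       # what the inverter can already push into the pack
V2L_POWER_W = 2400         # nominal V2L output discussed in this thread
PACK_VOLTAGE_V = 52        # assumed nominal 16S LFP pack voltage
CHARGER_EFFICIENCY = 0.92  # assumed AC-DC charger efficiency

extra_a = V2L_POWER_W * CHARGER_EFFICIENCY / PACK_VOLTAGE_V
worst_case_a = INVERTER_MAX_A + extra_a

print(f"Extra charge current from the V2L charger: {extra_a:.0f} A")  # ~42 A
print(f"Worst-case combined charge current: {worst_case_a:.0f} A "
      f"vs BMS limit of {BMS_MAX_CHARGE_A} A")
# If the combined figure exceeds the BMS limit, the inverter's charge current
# needs capping to leave headroom for the V2L charger.
```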

That's not so. Power MOSFETs have almost 0 ohms when switched on and infinite ohms when switched off, therefore the heat build up is zero. The time they do build up heat is when they're switching between on and off, but the rapid switching that they do minimises any heat build up.

Here's the MOSFETs and heatsink on a Seplos 200A V3 BMS; the heatsink is completely passive and even on full load doesn't exceed ambient by more than a couple of degrees.
They are some sizeable heatsinks on the Seplos, but I disagree where you say "the heat build up is zero". The RDS(on) values for a MOSFET are likely to be in the few mOhm range. But with (say) 100A current spread across 8 or 16 MOSFETs you are likely to get several watts of power dissipation. FWIW my BMS will rise to 10 to 20 degrees above ambient after several hours of 100A charging.

Re failures of BMS MOSFET's and your comment...
Think about MOSFETs used in other fields, speed control of our EVs for example.
The view I take is that it all comes down to the design of the circuit and choice of components, heatsinks etc. for a specific purpose. I've more of a digital rather than analogue background, but IIRC it is not the heat dissipation of the MOSFET when on (i.e. related to the RDS(on) resistance) but the heat dissipation that happens during switching. That comes down to the rise/fall time of the gate driving circuitry, which in turn is affected by gate capacitance. Hence, I'm guessing here a bit, but I suspect that because a BMS is not designed to be switching its MOSFETs on and off except in an emergency situation, it is probably not designed to minimise that occurrence, which is why MOSFET failures on BMS boards are not uncommon.
 
The problem with that is that the inverter will likely rely on the AC-in port to connect the neutrals to earth. You need that connection for safety; Residual Current Devices (RCDs) won't work without this...

A few inverters have a separate generator input, distinct from the grid connection. Generators generally don't have a proper earth, especially portable ones, so the inverter probably won't rely on its neutral to earth bond. I have not looked into generator inputs, but at first glance, they would seem to be ideal. V2L is much like a portable generator.
That's exactly what I was thinking. Some hybrids have separate generator input, so that could be ideal in this case.

You wire the generator input to the V2L plug and when you need the power from the car, use that.

It's not neat, but it should work.
 
Some hybrids have separate generator input, so that could be ideal in this case.

You wire the generator input to the V2L plug and when you need the power from the car, use that.

It's not neat, but it should work.
(y) It's good if you have an inverter with generator input - mine doesn't, but the popular Sunsynk/SolArk/Deye ones do.
 
Then it would appear that none of the systems you have used are operating correctly.
In your opinion.

Inverters don't generally receive or act upon that detail and I never said they did. What I said was "on a correctly configured system.... then on reaching the max desired cell voltage (typically set to between 3500mV and 3600mV (for LFP's)) the BMS should signal a 100% SOC to the inverter, which in turn should stop its charge process."
And that would be wrong - why would the BMS report 100% SOC when it isn't?

They are some sizeable heatsinks on the Seplos,
No they're not, they're tiny; that was my point.

but I disagree where you say "the heat build up is zero". The RDS(on) values for a MOSFET are likely to be in the few mOhm range. But with (say) 100A current spread across 8 or 16 MOSFETs you are likely to get several watts of power dissipation.
Actually no, wrong again. The latest Seplos BMS (200A) has 14 devices on the discharge circuit and 14 on the charge circuit. The devices are MOT1113T N-channel MOSFETs. So a few FACTS here: the on-resistance is 1.3 mOhms. 200 amps max current across 14 devices is 14.28 amps per device, so power = 1.3 mOhms x 14.28² = 0.26 W per device, or about 3.71 W on full load across all 14 devices. As I said, it creates little / no heat. The turn-on / turn-off time is 130 ns, which again will produce insignificant heat. Finally, these devices are rated at 500 amps continuous, 1200 amps pulsed. The bank of 14 will have a 7000 amp switching capacity, so you can really see how a mere 200 amps in this application doesn't cause it to break into a sweat - well within its specification.
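For anyone who wants to check the arithmetic, here it is spelled out (the device figures are as quoted above, not independently verified):

```python
# Reproducing the conduction-loss arithmetic quoted above. The device figures
# (1.3 mOhm on-resistance, 14 parallel devices, 200 A) are as stated in the
# post, not independently verified here.

RDS_ON_OHM = 0.0013    # on-resistance per MOSFET
DEVICES = 14           # parallel MOSFETs in the charge (or discharge) path
PACK_CURRENT_A = 200   # maximum BMS current

per_device_a = PACK_CURRENT_A / DEVICES
per_device_w = per_device_a ** 2 * RDS_ON_OHM
total_w = per_device_w * DEVICES

print(f"Current per device:    {per_device_a:.2f} A")   # ~14.29 A
print(f"Loss per device:       {per_device_w:.3f} W")   # ~0.265 W
print(f"Total conduction loss: {total_w:.2f} W")        # ~3.71 W
```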

FWIW my BMS will rise to 10 to 20 degrees above ambient after several hours of 100A charging.
It would appear your system is poorly spec'd, incorrectly installed or faulty.

Re failures of BMS MOSFET's and your comment...

The view I take is that it all comes down to the design of the circuit and choice of components, heatsinks etc. for a specific purpose. I've more of a digital rather than analogue background, but IIRC it is not the heat dissipation of the MOSFET when on (i.e. related to the RDS(on) resistance) but the heat dissipation that happens during switching.
Correct - the rise and fall times, 130 ns in these particular devices, equate to very little (read: zero) heat.

That comes down to rise/fall time of the gate driving circuitry, which in turn is affected by gate capacitance. Hence, I'm guessing here a bit, but I suspect that because a BMS is not designed to be switching its MOSFETS on and off except in an emergency situation, it is probably not designed to minimise that occurrence which is why MOSFET failures on BMS boards are not uncommon.
I would suggest it is uncommon; of the 50+ systems I have been involved with, none have had MOSFET failures. You have raised an interesting point though, and I have emailed Seplos directly asking the question about using the BMS as a control to stop / start charge / discharge. I have a contact in the design / engineering team there, 'Alan', who is most helpful. I will share his response.


J
 
Agreed BUT if a couple of cells hit the limits the BMS will be opening the charge / discharge circuits to stop operation - exactly what it's supposed to do. It will not message the Victron Inverter to say cells 4 and 5 are at maximum, please stop charging; it simply opens the charge circuit.
It won't mention cells 4 and 5, no. The inverter doesn't care about that detail. But the BMS should send a message to the inverter saying in effect stop charging, however discharging is still allowed. The RS-485 version of the Pylontech protocol has such a message.

It should send this message well before disconnecting the MOSFETs.

I've often wondered how an inverter would react when the battery is suddenly disconnected under load. At the very least, it could have trouble keeping the battery and bus voltages within limits, especially instantaneously after the cut-off. Some inverters have the ability to operate without a battery at all, but even those may assume that once a battery is detected, it can be relied on to be present, stabilising the battery bus, until next power-up.

Finally, with respect to the MOSFETs having an easy life disconnecting their rated loads. When the load has significant inductance, there will be a voltage "kick" as the inductor tries to keep the current flowing. An inverter's battery port would generally be fairly low inductance, I would think, but there will be inductance in the battery cables. The BMS obviously has no control over the battery cables, and therefore over the inductance of its load. At the hundred amp level, the magnetic field of a metre of cable is significant. I'd expect loose wires to physically jump when that level of current comes on or off, for example. I wonder about the voltage rating of the BMS MOSFETs. If some BMS use say 60V silicon to minimise cost, then in installations with longer battery cables, this inductive kick coupled with the low MOSFET voltage rating might be the cause of the more frequent MOSFET failures.
 
It won't mention cells 4 and 5, no. The inverter doesn't care about that detail. But the BMS should send a message to the inverter saying in effect stop charging, however discharging is still allowed. The RS-485 version of the Pylontech protocol has such a message.

It should send this message well before disconnecting the MOSFETs.
It may well do just that, BUT what does the inverter do to stop the charging? It turns off MOSFETs, which is exactly the same situation as the BMS switching off the charge circuit, which several people seem to be getting uptight about.

I've often wondered how an inverter would react when the battery is suddenly disconnected under load. At the very least, it could have trouble keeping the battery and bus voltages within limits, especially instantaneously after the cut-off. Some inverters have the ability to operate without a battery at all, but even those may assume that once a battery is detected, it can be relied on to be present, stabilising the battery bus, until next power-up.
My Victron system is battery only, and I have opened the BMS discharge circuit several times; there is no discernible shock or kick on the output when monitored using an oscilloscope. It just continues feeding the grid through. My Growatt system has solar and battery (hybrid inverter); the same check on this gave a very minor kick on one half cycle during battery disconnection. It was only very minor and not detectable on LED lights etc.

Finally, with respect to the MOSFETs having an easy life disconnecting their rated loads. When the load has significant inductance, there will be a voltage "kick" as the inductor tries to keep the current flowing.
Agreed that the collapsing magnetic field will cause a back-EMF spike, but for the life of me I can't imagine much by way of an inductive load on the DC input to an inverter. The BMS however is protected against this with the diodes across the MOSFETs, clearly seen in the photograph in post #46.
An inverter's battery port would generally be fairly low inductance, I would think,
Agreed, just huge capacitors LOL.

but there will be inductance in the battery cables.
It's generally acknowledged that inductance in a straight cable run is insignificant due to no return path etc. The magnetic field created around it as current flows is very small unless of course we form it into a coil.

The BMS obviously has no control over the battery cables, and therefore over the inductance of its load. At the hundred amp level, the magnetic field of a metre of cable is significant. I'd expect loose wires to physically jump when that level of current comes on or off, for example. I wonder about the voltage rating of the BMS MOSFETs.
Back in my younger years working in the steelworks, one of the initiations I endured as an apprentice was to be put in the middle of a cluster of water-cooled cables feeding the electrodes of a 140 ton electric arc furnace. When the electrodes struck the arc it was quite a crush, but this was mega amps - I forget just how many - and a couple of hours in there made me appreciate the strength of the magnetic field! I can't say I've noticed any cables move when they've come on load, but I'll certainly actively look for it next time I'm working on my system. Just thinking out loud, I haven't noticed the effect on automotive jump leads, which can also have several hundred amps flowing.
If some BMS use say 60V silicon to minimise cost, then in installations with longer battery cables, this inductive kick coupled with the low MOSFET voltage rating might be the cause of the more frequent MOSFET failures.
The MOSFETs used in my Seplos BMS have a Drain-Source Breakdown Voltage of minimum 100V, typical 190V. So at least double the supply voltage they're switching.

J
 
It may well do just that, BUT what does the inverter do to stop the charging? It turns off MOSFETs, which is exactly the same situation as the BMS switching off the charge circuit, which several people seem to be getting uptight about.
Values of PWM are being changed - it's really not the same. And yes, those MOSFETs are switching tens of thousands of times per second, but they are in carefully controlled inductance situations, with capacitors very nearby to absorb the inevitable spikes.

Agreed that the collapsing magnetic field will cause a back-EMF spike, but for the life of me I can't imagine much by way of an inductive load on the DC input to an inverter. The BMS however is protected against this with the diodes across the MOSFETs, clearly seen in the photograph in post #46.
All MOSFETs have inherent diodes as well. But the current would have been flowing out of the battery; the discharge MOSFETs will have their diodes pointed the other way. The inductor will try to keep the current flowing in the same direction, so the kick will be negative, and won't protect the MOSFET. I hope I got that right.

It's generally acknowledged that inductance in a straight cable run is insignificant due to no return path etc. The magnetic field created around it as current flows is very small unless of course we form it into a coil.
Designers of high power converters would disagree with you. Straight wires have the least inductance, true, but not zero. Typically one nano henry per millimetre, or one micro henry per metre. A pair of one metre battery cables would be two metres worth, as the inductances would add. So with one metre cables (many would be longer), that's E = ½LI² = 0.5 x 2 x 10⁻⁶ x 10⁴ = 10⁻²J. That does sound small for a bunch of power MOSFETs, but I don't know.
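Spelling that estimate out with the same rule-of-thumb figures:

```python
# Energy stored in the battery-cable inductance at the moment of disconnection,
# using the ~1 uH per metre rule of thumb and the 100 A figure from above.

INDUCTANCE_PER_M = 1e-6  # ~1 uH per metre of cable
CABLE_LENGTH_M = 1.0     # per conductor; positive and negative legs add
CURRENT_A = 100.0

loop_l = INDUCTANCE_PER_M * CABLE_LENGTH_M * 2     # both legs in series
energy_j = 0.5 * loop_l * CURRENT_A ** 2           # E = 1/2 * L * I^2

print(f"Loop inductance: {loop_l * 1e6:.1f} uH")   # 2.0 uH
print(f"Stored energy:   {energy_j * 1e3:.1f} mJ") # 10.0 mJ dumped into the FETs
```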

Back in my younger years working in the steelworks one of the initiations I endured as an apprentice was to be put in the middle of a cluster of water cooled cables feeding the electrodes of a 140 ton electric arc furnace. When the electrodes struck the arc it was quite a crush but this was mega amps...
Heh. Some initiation. Of course, I'm not talking about crushing force, but enough to make multistranded wire visibly move.

The MOSFETs used in my Seplos BMS have a Drain-Source Breakdown Voltage of minimum 100V, typical 190V. So at least double the supply voltage they're switching.
I thought that they would have ample voltage headroom, and that explains why you don't see failures. My theory is that the ones that do fail would be ones that have far less voltage headroom. The way that MOSFETs are made, to get high current capability, you have to sacrifice voltage rating. Put another way, for a given current rating, you need to put more devices in parallel when using higher voltage parts. More parts means higher manufacturing cost. So the temptation for the lower priced BMSs is to skimp on the voltage rating. Many times, they'll get away with it, but for those unlucky customers, the smoke comes out.
 
The magnetic field created around it as current flows is very small unless of course we form it into a coil.
FWIW, the Solis inverter's battery cables are wound around a 4" diameter ferrite ring for anti RFI purposes - like this...

[Image: battery cables wound around a ferrite ring]


But, aside from any component damage concerns, I still advocate that, in a properly implemented closed-loop system, it is the BMS that should inform the inverter of the charge rate and voltage as SOC reaches 100% and once the inverter is told the SOC is 100% the inverter should stop charging. The BMS's over/under protection mechanisms should not normally be triggered unless something else has gone wrong.
 
FWIW, the Solis inverter's battery cables are wound around a 4" diameter ferrite ring for anti RFI purposes - like this...

[Image: battery cables wound around a ferrite ring]
The cables look a little on the small side for my liking.


But, aside from any component damage concerns, I still advocate that, in a properly implemented closed-loop system, it is the BMS that should inform the inverter of the charge rate and voltage as SOC reaches 100% and once the inverter is told the SOC is 100% the inverter should stop charging. The BMS's over/under protection mechanisms should not normally be triggered unless something else has gone wrong.
Yep, I do understand your stance and if that makes you happy that's good. In my system I don't charge to 100% anyway, so I don't push it into the BMS halting the charge, except for Saturday night. The Seplos BMS recalibrates itself by being charged to the alarm settings and being held in that state for a few minutes. Sunday is washday, so I take advantage of the slight additional stored energy from the calibration. But I still maintain there's nothing wrong with the BMS halting the charge or discharge, and component damage won't occur from doing so ;)

[Image: the Seplos BMS board]

Here it is, no smoke escaped from anything, no burn marks and no sign of thermo-nuclear meltdown.

J
 
