Solar battery and V2L

Apologies - went off thread (with my Quattro II (dual AC input)), but could anyone use some of these newly released Victron Orion-Tr 12/48V-8A (380W) isolated DC-DC converters to trickle charge their 48V system via V2L in winter?
Well no, unfortunately you cannot use a DC-DC converter, as V2L outputs AC.... 😉. You can of course use a 48V charger from Victron which also has CAN connectivity, so it's absolutely possible to link it to a MultiPlus II via the Cerbo.
 
As both @wattmatters and I have repeated, forced disconnection of charge or discharge current via the MOSFETs is a final safety device that, in normal operation, should never trigger. Think of that action as being like a seat belt or airbag in a car: yes, it will do its job and protect your LiFePO4s, but the situation that triggers it should ideally never happen.
I used to believe that too. But now I'm not so sure.
I came across this post explaining how the pair of MOSFET banks works:


It seems that it's possible to disable charging and discharging pretty much independently, and with relatively low loss. I'm not sure how inexpensive inverters with poor control loops will react to a BMS stopping charging; I guess it looks to the inverter as though the internal resistance has suddenly risen dramatically (the battery terminal voltage rises suddenly), so the inverter should get the hint to stop charging. Even if the control system has significant overshoot (mine certainly do), the battery-side electrolytics can handle a bit of over-voltage for a few tens of seconds, and if there is a sudden load, the battery voltage should suddenly go down and the battery can quickly respond to the load.
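As a rough illustration of what I mean by independent control, here's a minimal sketch in C of how the two FET-bank enables might be decided. The thresholds, hysteresis and names are all my own invented examples, not anything taken from a real BMS:

```c
/* Minimal sketch of independent charge/discharge control in a BMS.
 * Thresholds and names are illustrative only, not from any real
 * BMS firmware. Voltages are per-cell, in millivolts.
 */
#include <stdbool.h>
#include <stdint.h>

#define CELL_OV_MV   3650u  /* stop charging above this (assumed)    */
#define CELL_UV_MV   2800u  /* stop discharging below this (assumed) */
#define HYST_MV        50u  /* hysteresis so the FETs don't chatter  */

typedef struct {
    bool charge_fet_on;     /* drives the charge MOSFET bank    */
    bool discharge_fet_on;  /* drives the discharge MOSFET bank */
} fet_state_t;

void bms_update(fet_state_t *s, uint16_t max_cell_mv, uint16_t min_cell_mv)
{
    /* Charge path: disable on over-voltage, re-enable once the highest
     * cell has fallen back below the threshold minus the hysteresis. */
    if (max_cell_mv >= CELL_OV_MV)
        s->charge_fet_on = false;
    else if (max_cell_mv < CELL_OV_MV - HYST_MV)
        s->charge_fet_on = true;

    /* Discharge path: same idea, keyed off the lowest cell. The two
     * decisions are completely independent, so the pack can still
     * supply a load while charging is blocked. */
    if (min_cell_mv <= CELL_UV_MV)
        s->discharge_fet_on = false;
    else if (min_cell_mv > CELL_UV_MV + HYST_MV)
        s->discharge_fet_on = true;
}
```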

How well this works in practice remains to be seen, though @johnb80 finds it works smoothly with his Victron system. I expect the Victron inverters to have excellent control systems though, unlike the less expensive inverters.

Comments welcome.
 
I used to believe that too. But now I'm not so sure.
I came across this post explaining how the pair of MOSFET banks works:

Nice write-up, good explanation. Once we have several devices in parallel, the current ratings of the system go up dramatically and the power dissipation goes down.

It seems that it's possible to disable charging and discharging pretty much independently, and with relatively low loss. I'm not sure how inexpensive inverters with poor control loops will react to a BMS stopping charging;
The battery load/supply is relatively benign, with no nasty inductive kicks etc. The inverters tend to use SMPSU controller chips to control the charging current, and these switch at a relatively high frequency. Because of that frequency they can respond rapidly to a changing load, and I doubt there will be any nasty spikes on the DC bus on either side of the switching device.
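To put a very rough number on "relatively high frequency" (the 50 kHz switching frequency and 50-cycle settling figure below are assumed round numbers for illustration, not anything measured):

```c
/* Back-of-envelope check on how fast a switch-mode charging stage can
 * settle. Both figures are assumed round numbers, not measurements.
 */
#include <stdio.h>

int main(void)
{
    const double f_sw_hz = 50e3;           /* assumed switching frequency   */
    const double cycles_to_settle = 50.0;  /* assumed loop settling, cycles */
    const double settle_ms = 1e3 * cycles_to_settle / f_sw_hz;

    printf("Switching period: %.0f us\n", 1e6 / f_sw_hz);
    printf("Settling time for %.0f cycles: %.1f ms\n",
           cycles_to_settle, settle_ms);  /* comes out at 1.0 ms */
    return 0;
}
```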

I guess it looks to the inverter as though the internal resistance has suddenly risen dramatically (the battery terminal voltage rises suddenly), so the inverter should get the hint to stop charging. Even if the control system has significant overshoot (mine certainly do), the battery-side electrolytics can handle a bit of over-voltage for a few tens of seconds, and if there is a sudden load, the battery voltage should suddenly go down and the battery can quickly respond to the load.
It will be milliseconds as opposed to seconds.

How well this works in practice remains to be seen, though @johnb80 finds it works smoothly with his Victron system. I expect the Victron inverters to have excellent control systems though, unlike the less expensive inverters.

Comments welcome.
I drive the system into the limits once per week to cause the BMS calibration routine to run. In normal operation the batteries run 10-90%, so the BMS doesn't usually cause the processes to stop. When operating from the independent battery charger I aim for a terminal voltage corresponding to circa 80%, so again the BMS doesn't stop the process. I did run it to the extremes for several days at the very start of my system, whilst getting everything to run properly, and not once did it give any cause for concern in terms of temperature or voltage spikes etc.
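If anyone wants to automate that sort of policy, the rule itself is trivial; here's a rough sketch of it (the names and constants are made up for illustration, a real setup would take them from its settings):

```c
/* Rough sketch of the weekly-calibration rule described above.
 * SoC targets and the 7-day interval are illustrative, not a
 * recommendation.
 */
#include <stdint.h>

#define DAY_SECONDS        86400u
#define CAL_INTERVAL_DAYS  7u      /* run the pack to its limits weekly */
#define NORMAL_TOP_SOC     90u     /* everyday upper limit, %           */

/* Returns the charge target (%) for today, given seconds since the
 * last full-range calibration cycle. */
uint8_t charge_target_soc(uint32_t seconds_since_last_cal)
{
    if (seconds_since_last_cal >= CAL_INTERVAL_DAYS * DAY_SECONDS)
        return 100u;           /* let the BMS see a genuine top-of-charge */
    return NORMAL_TOP_SOC;     /* otherwise stay inside the normal window */
}
```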
 
The inverters tend to use SMPSU controller chips to control the charging current, and these switch at a relatively high frequency. Because of that frequency they can respond rapidly to a changing load, and I doubt there will be any nasty spikes on the DC bus on either side of the switching device.
They certainly can respond quickly, if the firmware is properly written. Alas, at the bottom end of the market, which was all that was affordable when I bought mine (late 2016), the PI (Proportional-Integral) control loops don't account for integral wind-up, so the overshoots are sometimes of the order of 20 seconds. It's really frustrating to watch.
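For what it's worth, the fix is conceptually simple; below is a minimal sketch of a PI loop that just stops integrating while the output is saturated. It's generic C for illustration, not the actual inverter firmware:

```c
/* Minimal PI controller with conditional integration (anti-windup).
 * Generic illustration only -- not taken from any inverter firmware.
 */
typedef struct {
    float kp;        /* proportional gain          */
    float ki;        /* integral gain (per second) */
    float integ;     /* integrator state           */
    float out_min;   /* actuator limits            */
    float out_max;
} pi_t;

float pi_update(pi_t *c, float setpoint, float measured, float dt)
{
    float err = setpoint - measured;
    float out = c->kp * err + c->integ;

    if (out > c->out_max) {
        out = c->out_max;              /* actuator saturated high...        */
    } else if (out < c->out_min) {
        out = c->out_min;              /* ...or saturated low               */
    } else {
        c->integ += c->ki * err * dt;  /* only integrate while unsaturated, */
    }                                  /* so the integrator can't wind up   */

    return out;
}
```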

Alas, it hasn't bothered me enough to find and patch the firmware. I've had a few half-hearted goes at finding the main loops, but 8-bit assembly language with many pushes, pops, and stack accesses makes it very hard to follow. Fortunately, the 8-bit code is only the PV code in my case, but that seems to be where the worst overshoots occur.
 
They certainly can respond quickly, if the firmware is properly written. Alas, at the bottom end of the market, which was all that was affordable when I bought mine (late 2016), the PI (Proportional-Integral) control loops don't account for integral wind-up, so the overshoots are sometimes of the order of 20 seconds. It's really frustrating to watch.
I was thinking more in terms of the MOSFETs; no nasties would come back to damage them.
The PI on your inverters sounds awful: poorly implemented and poorly tuned. I spent a little time contracting to an instrumentation and tuning company; they used a specialist piece of equipment (Protuner) that monitored a process. You made a step change to the process in manual, it recorded what happened, and it then calculated the PID values to put into the control system. Running the Protuner on a cracker in an oil refinery, it gave its PID values and a well-experienced old-school controls engineer flatly refused to put them into the system. After a very heated debate he was overruled by management, and the result was a very stable process with three times the throughput LOL. It was amazing just how many poorly tuned systems were out there in industry, wasting loads of time and energy etc.
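That step-test approach is essentially the classic open-loop (reaction curve) method. Protuner's own algorithm is proprietary, so purely as an illustration, the textbook Ziegler-Nichols version of turning a recorded step response into PID values looks like this:

```c
/* Classic Ziegler-Nichols open-loop (reaction curve) tuning.
 * K = process gain, L = dead time (s), T = time constant (s),
 * all measured from a manual step test as described above.
 * Not Protuner's algorithm; just the textbook rule for comparison.
 */
#include <stdio.h>

typedef struct { double kp, ti, td; } pid_gains_t;

pid_gains_t zn_open_loop(double K, double L, double T)
{
    pid_gains_t g;
    g.kp = 1.2 * T / (K * L);  /* proportional gain         */
    g.ti = 2.0 * L;            /* integral time (seconds)   */
    g.td = 0.5 * L;            /* derivative time (seconds) */
    return g;
}

int main(void)
{
    /* Example numbers only: gain 2.0, 5 s dead time, 60 s time constant. */
    pid_gains_t g = zn_open_loop(2.0, 5.0, 60.0);
    printf("Kp = %.2f, Ti = %.1f s, Td = %.1f s\n", g.kp, g.ti, g.td);
    return 0;
}
```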

Alas, it hasn't bothered me enough to find and patch the firmware. I've had a few half-hearted goes at finding the main loops, but 8-bit assembly language with many pushes, pops, and stack accesses makes it very hard to follow. Fortunately, the 8-bit code is only the PV code in my case, but that seems to be where the worst overshoots occur.
What processor? I spent a good few years writing Z80-based real-time machine control programs, and I'm sure somewhere I have a REVAS for the Z80 if it would help.
 
What processor?
It's an NXP MC9S08AC48, apparently part of the HCS08 family. Not one of the 30 or so assembly languages that I've used before, so I had to learn it from scratch. It had irritating features, like the way the TSX (Transfer Stack pointer to indeX register) instruction takes one off the offset, so what was at, say, offset 6 from the stack (referenced as 6,sp) is now at 5,x! Also, x sometimes represents a 16-bit register (really H concatenated with X) and sometimes an 8-bit register (the low half of H:X).

The Z80 was one of my early processors, and I disassembled many Z80 programs. IDA Pro, my disassembler of choice, has a Z80 front end, though by the time I started using IDA (Pro or not), I'd stopped analysing Z80 programs.

Edit: Apologies for the thread drift. We older posters have a tendency towards nostalgia.
 
It's an NXP MC9S08AC48, apparently part of the HCS08 family. Not one of the 30 or so assembly languages that I've used before, so I had to learn it from scratch. It had irritating features, like the way the TSX (Transfer Stack pointer to indeX register) instruction takes one off the offset, so what was at, say, offset 6 from the stack (referenced as 6,sp) is now at 5,x! Also, x sometimes represents a 16-bit register (really H concatenated with X) and sometimes an 8-bit register (the low half of H:X).
There are some weird instruction sets in different devices, and I found it difficult to change between hardware. I did quite a bit with PIC chips; the speed absolutely threw me and I struggled at the start.

The Z80 was one of my early processors, and I disassembled many Z80 programs. IDA Pro, my disassembler of choice, has a Z80 front end, though by the time I started using IDA (Pro or not), I'd stopped analysing Z80 programs.
I used the Hitachi 64180, a Z80-based processor but with built-in I/O, serial ports, CTC etc.

Edit: Apologies for the thread drift. We older posters have a tendency towards nostalgia.
Aye, Nostalgia is not what it used to be :ROFLMAO:
 
