
PWM solar charge control versus MPPT

  • 29-12-2015 10:58pm
    #1
    Posts: 5,238 ✭✭✭


    MPPT seems to be regarded as the better of the two technologies. My view is that it is more complex, hence less reliable, and considerably more expensive.

    It does allow a rough-and-ready specification of a system that was poorly designed in terms of its inter-compatibility. It allows an array to be sited a very long way from the controller by reducing the installation's cabling costs, and it does yield a higher output in certain circumstances.

    Most PV dealers will recommend an array VOC of 150% of the battery voltage; others go as high as they can get away with. This can be a CYA approach. In a battery-based system the VOC needs to be no greater than the required temperature-compensated absorption voltage of the battery, with a minor overage for cabling losses, and ideally this will coincide with a Vmp matching the battery nominal voltage.
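    To put rough numbers on the contrast between the dealer rule and the matched-sizing approach described above, here is a sketch for a 12 V flooded bank. It uses only figures from this post (the 150% rule, the 2.47 V-per-cell FLA setpoint mentioned further down); the 0.5 V cabling allowance is an assumed illustrative value, not a quoted one.

```python
# Rough comparison of the two VOC sizing approaches for a 12 V FLA bank.
# The 0.5 V cabling allowance is an assumed illustrative figure.

battery_nominal = 12.0                 # volts, 6-cell flooded lead-acid
dealer_voc = 1.5 * battery_nominal     # the "150% of battery voltage" rule
absorption = 2.47 * 6                  # 2.47 V/cell high FLA setpoint, 6 cells
matched_voc_budget = absorption + 0.5  # absorption plus a small cabling overage

print(f"dealer rule VOC:    {dealer_voc:.1f} V")          # 18.0 V
print(f"matched VOC budget: {matched_voc_budget:.2f} V")  # 15.32 V
```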

    PV modules are current sources: amps charge batteries, not volts; volts are just the conduit. If the array is well suited to the battery nominal voltage then it will, up to a point, self-regulate, rising in voltage as the current demand falls in a form of "natural" absorption. This allows for minimal intervention by charge control or charge diversion, and therefore full array output to the battery through a low-impedance path. With an end-of-charge voltage near enough to the battery absorption voltage, the array's ability to push power into a fully charged battery and create unwanted heat is limited by the lack of potential difference.

    This throws up the question: do I need voltage overhead to equalise? Catering for this facility from PV is quite a waste of power on a day-to-day basis, and equalisation is not strictly necessary in a well-managed system. If the battery is Gel, AGM, VRLA or SLA, equalisation is not recommended in the first place, as the battery cannot sustain the regime due to its lack of serviceability. With FLA, a high charge setpoint (2.47 V per cell) and a healthy charge regime is often enough. For the once-in-a-blue-moon occasion where the battery does require an EQ cycle, a tertiary charger can be used.
    If you find yourself with FLA and no other charging sources, then yes, you need to factor this into your allowable VOC, or integrate a changeover circuit that reconfigures the existing array, via a changeover jumper, to a higher series voltage for the duration required, through the same controller.

    The fact of the matter is that, quite often, a PWM system well matched to the array can be more reliable and produce more bang for the buck.

    • If you can get a >10% larger array sited locally with a PWM controller for less than the MPPT option, then the answer is PWM.


    • If cabling cost is prohibitive and/or the price premium of an MPPT controller is less than the cost of a 10% larger array, then the answer is MPPT.
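    The two bullets reduce to a single cost comparison. A minimal sketch of that decision rule, using hypothetical total installed costs (array plus controller plus cabling) for systems of comparable delivered energy:

```python
def choose_controller(pwm_total_cost, mppt_total_cost):
    """Decision rule from the bullets above: compare the total installed
    cost of a >10% larger, locally sited array with a PWM controller
    against the MPPT alternative (smaller array, longer cable run)."""
    return "PWM" if pwm_total_cost < mppt_total_cost else "MPPT"

# Hypothetical figures: larger array + PWM at 900 vs MPPT system at 1050.
print(choose_controller(900, 1050))   # PWM
print(choose_controller(1200, 1050))  # MPPT
```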


    {Extract from http://www.bogartengineering.com/support/faq}

    The "good" for PWM: it is a simpler and lower-cost technology.
    Under some common circumstances it can actually deliver more amps to the battery. That could be when:
    (1) days are moderate or warm, with few clouds.
    (2) batteries are charging at over 13 volts (in a 12 V battery system), which they almost always are when actually CHARGING.
    (3) panel voltage is properly matched to the battery voltage, for example "12V" panels are being used with a 12V system.
    PWM is actually more "power efficient" than MPPT, meaning less total power loss in the controller itself, so heat sinks in the design can be smaller (and less expensive). Missing in most analyses of MPPT is that there is always a conversion loss with MPPT, which tends to be higher the greater the voltage difference between battery and panels. That's why PWM can actually beat MPPT under the circumstances described above.
    Some places that analyze MPPT assume that panels with 30V open circuit voltage are being used in a 12V system. Any good MPPT system will easily provide better performance in that case.
    They also may assume batteries are charging at 12 or even 11 volts, which is unrealistic. Lead acid batteries are typically below 13 volts only when discharging, or perhaps charging with very little charging current--meaning the actual potential gain in amps is not great.
    The benefit for MPPT becomes apparent if you use panels not voltage matched for the battery. If they are not, MPPT will utilize more of the potential energy of the panels. For example, if you use 24 volt panels to charge a 12 volt battery system you must use MPPT, otherwise you would be using your panels very inefficiently. If you are trying to use PWM in that case, you are misusing the PWM technology.
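    The mismatch penalty in that 24 V-panel example can be sketched numerically. This sketch is not from the FAQ itself: the panel figures (Vmp 35 V, Imp 4 A) and the 95% conversion efficiency are assumed illustrative values.

```python
def pwm_amps(panel_imp):
    # A PWM controller clamps the panel to battery voltage; charge current
    # is roughly the panel's operating current, near Imp.
    return panel_imp

def mppt_amps(panel_vmp, panel_imp, battery_v, efficiency=0.95):
    # An MPPT controller harvests Vmp * Imp and down-converts it, minus a
    # conversion loss, so battery current can exceed panel current.
    return panel_vmp * panel_imp * efficiency / battery_v

# A "24 V" panel (Vmp ~35 V) on a 12 V bank charging at 13.5 V:
print(pwm_amps(4.0))               # 4.0 A -- most of the panel power wasted
print(mppt_amps(35.0, 4.0, 13.5))  # roughly 9.9 A
```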
    Another potential benefit with MPPT is that if distance between panels and batteries is far, smaller wire can be utilized by running panels at higher voltage to the batteries. Running at twice the voltage reduces wire size to 1/4, which for a long run can be a significant saving in copper wire.
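    The "twice the voltage, a quarter the wire" claim follows from holding delivered power and the *fractional* voltage drop constant: doubling the voltage halves the current and doubles the allowable absolute drop, so the conductor resistance can quadruple. A one-function sketch:

```python
def relative_copper_area(voltage_ratio):
    """For the same delivered power and the same fractional voltage drop,
    current scales as 1/ratio and allowable resistance as ratio**2, so the
    required conductor cross-section scales as 1/ratio**2."""
    return 1.0 / voltage_ratio ** 2

print(relative_copper_area(2))  # 0.25 -> a quarter of the copper, as stated
```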
    If temperatures are low enough, the slightly lower power efficiency of MPPT will be compensated by the higher panel voltages, which will result in a little more battery current. But in actual measurements we made using a commonly sold MPPT solar controller, this would occur at temperatures less than 55 degrees F (in full sun, when charging at more than 13 volts), where there is a slight advantage to MPPT in my location (Boulder Creek, near the California coast). As temperature drops below that (in full sun) MPPT will get some advantage, such as could occur at high elevations in Colorado in the winter. Potentially this would be at maximum about a 2.5% improvement in amps output for every 10 degrees F lower in temperature (or 4.6% per 10 degrees C colder; I'm using data from Kyocera KD-140 panels).
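    The per-degree figures in that paragraph are mutually consistent. Assuming the roughly 0.46 %/°C voltage coefficient implied by the Kyocera numbers (an inferred figure, not stated explicitly in the FAQ):

```python
def cold_gain_pct(delta_c, coeff_pct_per_c=0.46):
    """Approximate extra MPPT output from colder panels, using the
    ~0.46 %/degC coefficient implied by the Kyocera KD-140 figures."""
    return delta_c * coeff_pct_per_c

print(cold_gain_pct(10))                  # 4.6 % per 10 degC colder
print(round(cold_gain_pct(10 / 1.8), 2))  # ~2.56 % per 10 degF colder
```

    The 10 °F figure comes out near the 2.5% quoted above, since 10 °F is 10/1.8 °C.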
    There can be theoretically optimal situations (that I don't personally experience where I live) where MPPT could give some advantage: that is when solar current is present, but the batteries are quite low in charge--but because loads are high and even greater than the solar current the batteries are still discharging despite the solar current. Under these conditions the voltage COULD be at 12.5 volts, or even lower. Again, using data from Kyocera panels, ("Normal Operating Conditions") there is a theoretical maximum gain over PWM of 20% current assuming NO MPPT conversion loss and no voltage drop in the wires to the panels, at 20C (68F). With PWM, the voltage drop in the wires in this case would not affect the charging current. Now if in addition you lower the temperature to below freezing at 28 degrees F (while sun is shining) you might actually get up to a THEORETICAL nearly 30% gain while the batteries are discharging.
    The only REALLY BAD part of MPPT is all the hype surrounding it, for example one manufacturer advertises "UP TO 30% OR MORE" power harvested from your panels. If you are using solar panels properly matched to the batteries, 30% ain't gonna happen unless it's EXTREMELY cold. And your batteries have to be abnormally low in charging voltage, which tends not to happen when it's cold (unless you assume the battery is still discharging while solar is happening). Virtually all the analyses I've seen touting MPPT on the Internet ignore the conversion loss, assume really cold temperatures, assume unreasonably low charging voltages, assume no voltage drop in the wires from panels to batteries, use STC conditions for the panels (which the marketing types prefer) rather than more realistic NOCT conditions, and in some cases assume panels not voltage matched to the batteries.
    The other thing that is misleading about MPPT, is that some manufacturers make meters that show both the solar current and the battery current. In almost all cases for a well designed MPPT type the battery current will be greater. The engineers making these know better, but it is implied (by marketing types?) that if you were NOT using MPPT you would be charging your batteries with only the SOLAR current that you read on their meters. That's not true, because the PWM BATTERY current should always be higher than the MPPT SOLAR current. It is the nature of the MPPT that maximum power occurs when the current is lower than the maximum, so they must operate there to get the maximum power. So to properly compare the two you need to compare MPPT with an actual PWM controller in the same circumstances.
    Finally, the reason we went to PWM is that I was anticipating that panel prices were going to drop (which they certainly have over the last 5-10 years!) and that the small advantage of MPPT (under conditions where the correct panels are used for the batteries) would not justify their additional cost and complexity. So my thinking, for more total benefit per $, put your money in an extra panel rather than a more expensive and complex technology.

