Lead-acid batteries - SoC vs voltage
johnd:
Does anyone here have any good insight (i.e. solid personal experience or access to definitive literature) into how the open-circuit voltage (OCV) of flooded deep-cycle lead-acid batteries varies with their state of charge (SoC)? I'm finding that a lot of the data on the web is either confused or very variable - even from apparently authoritative sources such as battery makers - or both.
This isn't just an academic nicety: when running a remote solar-powered AWS site at more northerly latitudes, it becomes quite important in winter to know the OCV at which the battery is starting to become seriously depleted, so that the system can cut out to prevent damage to the battery.
I think there's reasonably general agreement that an OCV of around 12.7V (obviously measured when there's no solar charging happening) represents around 100% SoC. But it's at the other (0-10% SoC) end of the scale that there's a lot of disagreement. Some sources seem to think that about 11.8V is the lowest voltage the battery should be allowed to reach, while others talk about 11.5V, or even claim that discharge down to 10.5V still allows a deep-cycle battery to recover adequately. So it's particularly at the bottom end where any further insight would be useful.
I've currently got the cellular modem set to drop out at 12.0V and the Envoy/WLIP PSU at 11.5V. The idea is that the modem takes most of the power, so if it drops out at a significantly higher voltage it leaves quite a significant reserve of battery capacity for the roughly 0.5W consumption of the Envoy/WLIP to carry on. Logging should therefore be able to continue uninterrupted even when the live long-distance data feed drops out.
So the question here is whether my chosen values of 12.0V (representing, I'm guessing, 20-25% SoC) and 11.5V (5-10% SoC) are sensible, or whether anyone has a better suggestion.
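(For reference, here's the sort of OCV-to-SoC mapping I'm assuming, as a minimal Python sketch. The table values are illustrative only - drawn from the kinds of figures quoted above - and a real curve will vary with temperature, age and battery construction:)

--- Code: ---
# Illustrative OCV-to-SoC lookup for a nominal 12V lead-acid battery.
# These table values are assumptions for illustration, not calibrated data.
OCV_SOC_TABLE = [  # (rested open-circuit voltage, state of charge %)
    (11.8, 0),
    (12.0, 25),
    (12.2, 50),
    (12.4, 75),
    (12.7, 100),
]

def soc_from_ocv(ocv):
    """Linearly interpolate SoC (%) from a rested open-circuit voltage."""
    if ocv <= OCV_SOC_TABLE[0][0]:
        return 0.0
    if ocv >= OCV_SOC_TABLE[-1][0]:
        return 100.0
    for (v_lo, s_lo), (v_hi, s_hi) in zip(OCV_SOC_TABLE, OCV_SOC_TABLE[1:]):
        if v_lo <= ocv <= v_hi:
            return s_lo + (s_hi - s_lo) * (ocv - v_lo) / (v_hi - v_lo)

print(soc_from_ocv(12.0))  # 25.0 on this illustrative curve
--- End code ---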
NB This is with a leak-proof battery. It's not AGM-type but TBH I can't remember whether it's just a leak-proof enclosure or a gel type.
(Sorry if this seems rather an esoteric question, but it does actually become quite important in the middle of winter when you're trying to maintain a flow of live data against often very limited solar power generation.)
SLOweather:
Hi John,
First, I wouldn't even mess with trying to roll my own low voltage disconnect. When I was designing and building small scale solar systems for municipal utility telemetry sites, I usually included a commercial device similar to this:
Per the spec sheet:
--- Quote ---Continuously monitors voltage level of a 12 volt power input (terminal 86)
If voltage is above 13.25V - power switch will turn on to supply up to 15 amps on the output terminal to power the loads
If voltage drops below 12.8V - time delay is started
If voltage remains below 12.8V until the timer expires - power switch will turn off, disconnecting the power to the loads
If voltage drops below 11.8V with the timer running - power switch is shut off immediately
Any time input voltage increases to above 13.25V - power switch will turn on to supply power to the loads and timer will reset
--- End quote ---
So, there are the voltages that the manufacturer uses.
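If you did still want to replicate that behaviour in your own controller, the quoted logic boils down to a small state machine. Here's a rough Python sketch - the thresholds are straight from the quote, but the timer length is my own assumption since the spec sheet doesn't give one:

--- Code: ---
# Sketch of the low-voltage-disconnect logic described in the quote above.
V_ON = 13.25      # reconnect (and reset timer) above this
V_TIMER = 12.8    # below this, start the delay timer
V_CUTOFF = 11.8   # below this, disconnect immediately
DELAY_S = 600.0   # assumed timer length; not given in the spec sheet

def lvd_step(voltage, load_on, timer_start, now):
    """One evaluation of the disconnect state machine.
    Returns the new (load_on, timer_start) state."""
    if voltage > V_ON:
        return True, None        # power on, timer reset
    if voltage < V_CUTOFF:
        return False, None       # immediate disconnect
    if voltage < V_TIMER:
        if timer_start is None:
            return load_on, now  # start the delay timer
        if now - timer_start >= DELAY_S:
            return False, None   # timer expired: disconnect
        return load_on, timer_start
    # Between 12.8V and 13.25V: hold state; resetting the timer here is
    # my reading of the spec, which doesn't cover this band explicitly.
    return load_on, None

state = (True, None)
state = lvd_step(12.5, *state, now=0.0)    # starts the delay timer
state = lvd_step(12.5, *state, now=700.0)  # timer expired: load switched off
--- End code ---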
Second, if you really want to do your own research and build your own disconnect, I'd do an experiment.
Get a brand new gel-cell or AGM battery and fully charge it. Apply a load sufficient to draw a current equal to 1/20 or so of the rated capacity - a C/20 rate, so about 4.25A for an 85Ah battery like yours. A lamp would be fine.
Measure the current over time and do a running calculation of the amp-hours until the battery is depleted to 50%. That's my guideline for maximizing battery life and the time between battery replacements.
Measure the battery voltage at that time with the load on. Disconnect the load and wait a few minutes and remeasure the battery voltage. It might be interesting to measure the voltage recovery fairly often until it levels off and graph it.
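The running amp-hour count is just numerical integration of the current samples over time. A minimal sketch, assuming a hypothetical read_current_amps() helper wrapping whatever shunt or meter you use:

--- Code: ---
import time

RATED_AH = 85.0               # rating of the battery in this thread
C20_A = RATED_AH / 20.0       # ~4.25A test load, roughly a 50W lamp at 12V
STOP_AT_AH = RATED_AH / 2.0   # stop at 50% depth of discharge
SAMPLE_S = 10.0               # sample interval, seconds

def run_discharge_test(read_current_amps):
    """Integrate current samples until 50% of rated capacity is drawn.
    read_current_amps is a hypothetical helper for your own meter."""
    drawn_ah = 0.0
    while drawn_ah < STOP_AT_AH:
        amps = read_current_amps()
        drawn_ah += amps * SAMPLE_S / 3600.0  # amps * hours per sample
        print("%.2f Ah drawn (%.1f%% of rated)"
              % (drawn_ah, 100.0 * drawn_ah / RATED_AH))
        time.sleep(SAMPLE_S)
    print("50% depth of discharge reached - remove the load and log recovery.")
--- End code ---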
johnd:
Chris, sorry I should have provided a few more background details in the post:
This is an installation which is already up and running.
(Actually the data is here: www.weatherlink.com/user/elydev )
Last winter there just wasn't enough solar to keep it running without drop-outs until spring, so I've made some upgrades over the summer. But obviously the season of minimal solar is approaching again, and so I'm aiming to monitor solar input, voltages etc more closely this year.
The upgrades have been to double the panel power to 60W (battery is still around 85Ah IIRC) and to put in an MPPT solar regulator, which hopefully should help a little. (The regulator obviously provides overvoltage etc protection on its own.)
I have also already split the power supply between the Envoy/WLIP (5V derived from the 12V battery supply via a Dimension Engineering DE-SW050 high-efficiency converter) and the cellular modem (12V), and run each supply through a separate adjustable voltage drop-out switch, which I managed to find available commercially and quite cheaply here in the UK. Currently, as per my initial post, the modem is set to drop at 12.0V (IIRC) with the logger not until about 0.5V lower (i.e. 11.5V). This is all installed and has been operating fine over the summer, which is good but unsurprising.
So, given the time of year, I'm just starting to look back at the details of this set-up and wondering whether I've chosen the voltage dropout points as well as I could. Of course, time will tell over the winter but any new background insights would also be a useful extra input.
The overall problem here at 52°N (with often quite persistently overcast conditions in midwinter) is that specifying a solar system that could provide guaranteed power, with something in reserve too, would be too costly. In dollar terms, you might be looking at having to spend say $1500 on the solar PSU to be sure of getting a continuous feed of live data back from a $500 weather station. So instead we have to try to be more clever about how to make a $500 solar PSU go as far as possible. (Of course another answer would be to add a wind turbine into the mix, but that would cost a separate/additional $500+ and would add further practical complications of its own.)
Bushman:
Why not just add batteries?
johnd:
--- Quote from: Bushman on October 26, 2011, 01:06:59 PM ---Why not just add batteries?
--- End quote ---
Yeah, it's a good point, and it may well be that the 85Ah that I currently use could usefully be increased, maybe even doubled.
But with any solar PSU like this there's a balance between generating capacity (i.e. solar panel wattage) and storage capacity. Whatever (realistic) battery capacity you put in, it's going to run out before too long if there's not enough charging capacity. And once it's run out, a 200Ah battery is not going to reacquire charge any faster than a 100Ah one when charging capacity is the limiting factor - see the rough numbers below. Plus, good deep-cycle batteries are not cheap themselves, so that's a factor too.
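To put some very rough numbers on that - and I stress these inputs are illustrative guesses, not measurements from my site:

--- Code: ---
# Back-of-envelope winter energy budget. All inputs are illustrative guesses.
PANEL_W = 60.0         # panel rating (from the upgrade above)
WINTER_SUN_H = 0.8     # assumed equivalent full-sun hours/day in midwinter at 52N
LOAD_W = 2.5           # assumed average load (modem + logger)
BATTERY_AH = 85.0
USABLE_FRACTION = 0.5  # only discharge to 50% to preserve battery life
V_NOM = 12.0

harvest_wh_day = PANEL_W * WINTER_SUN_H           # ~48 Wh/day in
load_wh_day = LOAD_W * 24.0                       # 60 Wh/day out
deficit_wh_day = load_wh_day - harvest_wh_day     # ~12 Wh/day shortfall
usable_wh = BATTERY_AH * V_NOM * USABLE_FRACTION  # 510 Wh usable reserve

print("Daily deficit: %.0f Wh" % deficit_wh_day)
print("Days until 50%% DoD: %.0f" % (usable_wh / deficit_wh_day))
# Doubling BATTERY_AH doubles the days of reserve, but once the battery is
# flat the recharge rate is still capped by (harvest - load), which a
# bigger battery does nothing to improve.
--- End code ---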
But I don't disagree - the balance I currently have between solar and battery capacity may not be optimal - not easy though to work out what the optimal balance should be.
Overall, I will be surprised if - for a given level of cost - it's possible to keep this system feeding live data continuously throughout the winter. But I would like (i) to keep the uptime as high as possible; and (ii) to keep the system logging even if the live data feed drops out at times. And using the optimal voltage drop-out points is going to be key to achieving this AFAICS.