What do you use to calibrate your Davis gauge? I never could get a Davis gauge working that well. I use a RW 111 gauge (and had to adjust the screws since my setup doesn't allow the gauge to be perfectly level) and the RW gauge works like a charm.
The tipping buckets retain a little bit of water, so I start by dumping water into the gauge until the bucket tips a couple of times, then drip water very slowly until it tips once more. At the sound of this last clunk, I stop immediately. Then I slowly dump distilled water into the rain gauge from a volumetric flask. I have a 250ml volumetric flask, which should be .452" in a 6.553" diameter funnel. I adjust the leveling screws (making sure to keep them level with each other) until 250ml causes 45 clunks of the bucket and still leaves a few drips remaining. It takes about a teaspoon to tip the bucket, so time can be saved by pouring almost a teaspoon quickly, then pouring really slowly until the tipping bucket is heard. I unplug the rain gauge to keep from sending test data to the internet. I'm not really sure why distilled water makes a difference; I get about two tips fewer with tap water.
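The calibration arithmetic above can be sketched like this (the 250ml volume, 6.553" funnel diameter, and 0.01"-per-tip bucket are my numbers, not official specs):

```python
import math

ML_PER_CUBIC_INCH = 16.387064  # 1 cubic inch = 16.387064 ml

def equivalent_depth_inches(volume_ml, funnel_diameter_in):
    """Depth of rain (inches) that a poured volume represents for a given funnel."""
    funnel_area = math.pi * (funnel_diameter_in / 2) ** 2  # square inches
    volume_in3 = volume_ml / ML_PER_CUBIC_INCH
    return volume_in3 / funnel_area

depth = equivalent_depth_inches(250, 6.553)
print(round(depth, 3))         # ~0.452 inches of equivalent rainfall
print(round(depth / 0.01, 1))  # ~45.2 tips on a bucket that tips every 0.01"
```

So 250ml should produce 45 full tips with a little water left in the bucket, which matches the "45 clunks and still a few drips" target.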
I still don't understand why total rainfall amount would have any effect on accuracy. Except for the initial rainfall required to get the funnel and tipping bucket wet, it seems like any amount of rain could be measured accurately. What am I missing?
The effect of rain RATE on accuracy makes more sense to me. In really heavy rain, some water can fall into the full side of the bucket even after the bucket has started to tip. If the funnel is dripping (as opposed to a stream), this shouldn't happen. I think it becomes a stream at a rate of about two inches per hour. I tried dumping 250ml of water into the funnel in 37.2 seconds, and recorded 39 tips. There was some water left in the tipping bucket, so I call it 39.5. 39.5/45.2 would indicate that the gauge recorded 87.4% of the amount it should have. Would the amount of water caught by the full bucket increase linearly with rain rate (after the first two inches per hour)? 39.5 tips in 37.2 seconds would be a recorded rain rate of 38.2 inches per hour, versus an actual rate of about 43.7 inches per hour (87.4% of the actual rain rate). Therefore, for my gauge, at rain rates higher than two inches per hour, one might use the formula:
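For what it's worth, here is the pour-test arithmetic spelled out (using my own test numbers: 250ml, or ~0.452" equivalent, poured in 37.2 seconds, with 39.5 tips recorded):

```python
actual_depth = 0.452           # inches equivalent of 250ml in my funnel
recorded_depth = 39.5 * 0.01   # inches, at 0.01" per tip
seconds = 37.2

actual_rate = actual_depth / seconds * 3600      # ~43.7 in/hr poured
recorded_rate = recorded_depth / seconds * 3600  # ~38.2 in/hr displayed
efficiency = recorded_depth / actual_depth       # ~0.874, i.e. 87.4% caught

print(round(actual_rate, 1), round(recorded_rate, 1), round(efficiency, 3))
```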
C=(1+(R-2)/250)*A
C=corrected rainfall amount (in)
R=displayed rain rate (in/hr)
A=displayed rainfall amount (in)
To be really accurate, a formula like this would need to be applied continuously, since most rain events I have witnessed start and end with much lower rain rates.
In a hypothetical rain event that the Davis recorded as .75", it is possible that half of the rain occurred at a rate lower than two inches per hour, allowing the bucket to tip between drips. If the Davis recorded the other half of the rain at a rate of 20"/hr (about half as fast as my test, but really hard rain for my area), this formula would be applied to the .375" that fell fast: (1+(20-2)/250)*.375, so: 18/250=.072, so: 1.072*.375=.402, and .402+.375 (the portion that fell slowly)=.777" (.027" error). Perhaps I am too quick to say "good enough".
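Putting the formula and the hypothetical event together (this is a sketch of my own correction idea, calibrated only to my gauge, not anything from Davis):

```python
def corrected_amount(displayed_amount_in, displayed_rate_in_hr):
    """Apply C = (1 + (R - 2) / 250) * A for displayed rates above 2 in/hr."""
    if displayed_rate_in_hr <= 2:
        return displayed_amount_in  # slow rain: bucket tips between drips
    return (1 + (displayed_rate_in_hr - 2) / 250) * displayed_amount_in

# The hypothetical 0.75" event: half slow, half displayed at 20 in/hr.
slow = corrected_amount(0.375, 1.0)   # unchanged: 0.375"
fast = corrected_amount(0.375, 20.0)  # 1.072 * 0.375 = 0.402"
total = slow + fast                   # ~0.777", a 0.027" correction
print(round(total, 3))
```

In practice this would have to run on each rate sample as the storm progresses, not on event totals, since the rate varies continuously.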
If the funnel becomes filled with water (birds clog the outlet) and this water is allowed to drain at full speed, I've seen significant error. When such conditions have occurred, I have blamed myself and accepted the fact that the rainfall was not recorded accurately.