Would be interesting to compare using an actual rain calibrator, not one fabricated with an unknown flow rate and volume.
My Davis rain gauges are calibrated to ±1%, or 0.05mm.
I first ensure both sides of the twin spoon tip evenly, i.e. the same volume of water on each side per tip. This ensures you don't have one side taking more or less water to tip, which would throw the calibration off straight away.
Some comments in regard to the above:
- A tipping rain gauge will always show some variation with flow rate, because that is what rain consistently does: vary in intensity. Any calibrator, however constant, is therefore only valid (as such) at the one particular rain rate it delivers, and real rain never holds a single rate. This is one reason that critical tipping-gauge installations also collect the physical rain amount for manual comparison as required. So it really doesn't matter how perfectly constant the calibrator is; it will only ever provide a calibration at one, and only one, specific rain rate, which does not follow the real world (see the rate-correction sketch after these comments).
- Claiming a gauge is calibrated to ±1% or 0.05mm, given rain-rate variation, is a little optimistic, and especially so for a tipper with a minimum resolution of 0.2mm: 0.05mm is only a quarter of a single tip, so ±1% (0.05mm) is technically not feasible regardless of what numbers appear (see the resolution check below). The stated accuracy of a Davis tipping-bucket gauge is around 5%.
- With tipping buckets it is really not ideal to have each individual tip identical, and given the small volume involved it is quite difficult to achieve precisely, especially as it is not the volume of water that triggers the tip but the weight of water, which can be quite variable in itself. The usual practice of totalling tips in pairs (or groups of pairs) averages out any variation between the two sides, so spending a whole heap of time trying to balance each tip exactly is rather fruitless (see the pair-balance sketch below). In fact it is best for the two tips not to be exactly identical, which ensures a cleaner tip without stuttering.
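To make the single-rate point concrete, here is a minimal Python sketch of the usual tip-count-to-rainfall conversion with a rate-dependent under-catch correction bolted on. The 0.2mm per tip matches the tipper discussed above, but the loss coefficient K is a made-up illustrative value, not a Davis figure; a real correction curve would have to be measured at several flow rates.

```python
# Minimal sketch (not Davis firmware): converting tip counts to rainfall
# with a rate-dependent under-catch correction. TIP_MM is the standard
# 0.2 mm tipper; the loss coefficient K is a made-up illustrative value.

TIP_MM = 0.2   # nominal depth per tip (mm)
K = 0.003      # assumed fractional loss per (mm/h) of rain rate

def corrected_rainfall(tips: int, interval_hours: float) -> float:
    """Apply a simple linear under-catch correction.

    A tipping bucket loses water while the bucket is in motion, so the
    raw total reads progressively lower as intensity rises. A constant-
    flow calibrator only pins this curve down at one single rate.
    """
    raw_mm = tips * TIP_MM
    rate_mm_h = raw_mm / interval_hours if interval_hours > 0 else 0.0
    return raw_mm * (1.0 + K * rate_mm_h)

# 50 tips in 15 minutes = 10 mm raw at 40 mm/h, nudged up to ~11.2 mm.
print(corrected_rainfall(50, 0.25))
```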
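And a quick back-of-envelope resolution check on the ±1% / 0.05mm claim, again assuming the standard 0.2mm-per-tip bucket:

```python
# Back-of-envelope check on the +/- 1% (0.05 mm) claim for a 0.2 mm tipper.
TIP_MM = 0.2

# The gauge reports in whole tips, so 0.05 mm is only a quarter of one
# tip -- below the instrument's own resolution.
print(0.05 / TIP_MM)   # 0.25 tips

# Worst-case quantization error is about half a tip (+/- 0.1 mm), so the
# percentage error depends entirely on the size of the event:
for total_mm in (1, 5, 20, 100):
    pct = 100 * (TIP_MM / 2) / total_mm
    print(f"{total_mm} mm event: +/- {pct:.1f}% from quantization alone")
```

Even before rain-rate effects, ±1% from quantization alone needs an event of 10mm or more; on a 1mm shower the step size alone is ±10%.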
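Finally, a toy illustration of why per-pair totals make exact side-to-side balance unnecessary. The 10% imbalance is an arbitrary example value, and the cancellation shown assumes the imbalance is symmetric about the nominal volume (i.e. the full two-tip cycle is what was calibrated):

```python
# Toy illustration of why per-pair totals cancel side-to-side imbalance.
# The 10% figure is an arbitrary example, not a measured value.
NOMINAL = 0.2                       # nominal mm per tip
imbalance = 0.10                    # one spoon tips 10% heavy, the other 10% light

left = NOMINAL * (1 + imbalance)    # 0.22 mm collected before a left-side tip
right = NOMINAL * (1 - imbalance)   # 0.18 mm collected before a right-side tip

# The spoons alternate, so every consecutive pair of tips delivers
# left + right, and the imbalance drops out of the pair total.
print(f"{left + right:.3f} mm per pair vs {2 * NOMINAL:.3f} mm nominal")
```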