I just came across this discussion and figured I'd weigh in. I've been using Davis Instruments products ever since I purchased a Weather Monitor II over 20 years ago. I never knew how far off the tipping bucket measurements were until about 10 years ago, when I mounted a CoCoRaHS gauge on the opposite side of the pole from my Vantage Pro2 tipping bucket. I was surprised to discover how much the Davis was under-reporting. While I recognize that a tipping bucket can never be calibrated to match the CoCo in all situations, I do think it can be adjusted so that the average error over time is minimal.
I start by calibrating the tipping bucket to a known volume of water. This gets it close and ensures both sides are equally balanced. Based on my calculations, 5.4377 cc should result in one tip of the bucket. However, it's pretty tough to measure a small volume of water to a fraction of a cc, so I use either 54.38 cc for 0.1" or 271.89 cc for 0.5". If you don't want to stand there long enough to trickle in such a large volume of water, I discovered a very simple solution. Measure the correct volume of water and pour it into a plastic water bottle. Puncture a VERY small hole in the cap, turn the bottle upside down in the rain collector, and if necessary, poke another small hole above the water line to let air in. You can stand there and count tips with the rain collector disconnected, but with the larger volume of water you might be there a while. Alternatively, you can leave the rain collector connected so it transmits to the ISS, then walk away until the water has drained and check your console for the total. (Note: When I do it this way, I always edit the database records to remove the 'false' precipitation. If you're uploading to CWOP or Weather Underground, you'll want to suspend the uploads during this process and clear the archive before restarting them.) Based on the delta between the tipping bucket and the CoCo, you simply adjust the screws as necessary to account for the discrepancy (one turn for each 6%).
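If you want to sanity-check those volume figures yourself, they fall right out of the collector geometry. Here's a quick sketch, assuming the Vantage Pro2's 6.5-inch funnel diameter and 0.01" of rain per tip (both standard Davis specs):

```python
import math

# Davis Vantage Pro2 rain collector: 6.5 in diameter funnel, 0.01 in per tip
DIAMETER_IN = 6.5
TIP_DEPTH_IN = 0.01
CM_PER_IN = 2.54

radius_cm = (DIAMETER_IN / 2) * CM_PER_IN
area_cm2 = math.pi * radius_cm ** 2               # funnel catch area in cm^2
cc_per_tip = area_cm2 * TIP_DEPTH_IN * CM_PER_IN  # water volume for one 0.01" tip

print(round(cc_per_tip, 4))       # ~5.4377 cc per tip
print(round(cc_per_tip * 10, 2))  # ~54.38 cc for 0.10"
print(round(cc_per_tip * 50, 2))  # ~271.89 cc for 0.50"
```

Same numbers I quoted above, so if your collector is a different diameter, just swap that one value in.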
The next step involves comparing the two gauges over a period of time and with varying rainfall intensities. At low rainfall rates, the volumetric technique described above should result in very little error. But if you live in an area with frequent heavy thunderstorms, this technique will result in the tipping bucket under-reporting by a significant amount. I adjust mine such that the readings are fairly close during a moderate thunderstorm. This typically results in over-reporting rainfall during light rainfall events and under-reporting during heavy rainfall events. In terms of error percentage, I want the positive error during light rain to be a bit larger than the negative error during heavy rain. Why? Because my intent is for the total rainfall reported over a longer period of time to be close to correct. Here along the Gulf coast, I tend to receive more precipitation throughout the year during heavy rainfall events than I do during light rainfall events.
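To make that last bit of reasoning concrete: the season-total error is effectively a rainfall-weighted average of the per-event errors, so when heavy events dominate your totals, even a small negative heavy-rain error takes a proportionally larger positive light-rain error to cancel it out. A quick sketch with made-up event data (the amounts and error percentages below are illustrative, not my actual records):

```python
# (actual_inches, gauge_error_fraction) per event -- hypothetical numbers
events = [
    (0.30, +0.05),  # light rain, gauge over-reports 5%
    (0.25, +0.05),
    (2.50, -0.02),  # heavy thunderstorm, gauge under-reports 2%
    (3.00, -0.02),
]

total_actual = sum(a for a, _ in events)
total_measured = sum(a * (1 + e) for a, e in events)
net_error = (total_measured - total_actual) / total_actual
print(f"net seasonal error: {net_error:+.2%}")  # -1.36%: still under-reporting
```

Even with the light-rain error at +5% against only -2% on heavy rain, the season still comes out low, because the heavy events carry most of the water. That's why I bias the calibration the way I do.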
The bigger question here is why Davis can't do a better job of calibrating the tipping bucket to begin with. I've found them to routinely be off by at least 25%. In fact, I recently sent my ISS in for refurbishing and I had to start the calibration process all over again when it was returned. It can take quite a bit of time, and I still don't have it just right. During the recent Tax Day flood, the CoCo measured 12.4", while the Davis reported 10.87" (>12% error). I made some additional adjustments, and based on several checks since then it seems that I'm over-reporting by 2.8% (this from a mixture of light and heavy rainfall events). It looks like I need to turn 'em to the right just a bit.
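If the one-turn-per-6% rule of thumb holds, that final tweak can be estimated directly. A sketch, using my current +2.8% over-reporting and the roughly 6%-per-full-turn figure mentioned above:

```python
PERCENT_PER_TURN = 6.0    # approx. calibration change per full screw turn
current_error_pct = 2.8   # currently over-reporting by 2.8%

turns = current_error_pct / PERCENT_PER_TURN
print(f"turn the adjustment screws about {turns:.2f} of a turn")  # ~0.47
```

In other words, a bit under half a turn to the right should get me close, then it's back to comparing against the CoCo over the next few events.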