I have been home-brewing a weather station. Other stations and software like Weather Display will calculate the rain rate, but I'm wondering: is there official guidance on how to do that?
How far back does one look to average the rainfall? 5 minutes? 10 minutes? Half an hour?
If the rain stops during this time frame, should the software routine set the rate to zero right away, or just let it wind down as time passes and no more rain comes into the gauge?
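To make the question concrete, here's the kind of windowed calculation I'm picturing; this is just a sketch, and the 10-minute window and the names are my own guesses, not anything from a standard:

```python
from collections import deque
import time

TIP_INCHES = 0.01      # rain per bucket tip
WINDOW_SEC = 10 * 60   # look-back window (10 minutes, chosen arbitrarily)

tips = deque()         # timestamps of recent tips

def record_tip(t=None):
    """Call this whenever the gauge sends a tip pulse."""
    tips.append(t if t is not None else time.time())

def rain_rate_in_per_hr(now=None):
    """Average rate over the window, in inches per hour.
    Old tips age out of the window, so the rate winds down
    to zero on its own once the rain stops."""
    now = now if now is not None else time.time()
    while tips and now - tips[0] > WINDOW_SEC:
        tips.popleft()
    return len(tips) * TIP_INCHES * 3600.0 / WINDOW_SEC
```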
Is there a relatively easy way to calculate the rate? My gauge, as most do, gives a tipper pulse for every 0.01" of rain. So I get a bunch of pulses which the datalogger counts up, but I'm going to have to figure out how much rain has occurred in the previous minutes. Is it possible to just measure the time between tips and use that to compute the rate?
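Something like this is what I have in mind for the time-between-tips approach; again just a sketch, and the 15-minute timeout for forcing the rate back to zero is my own assumption:

```python
import time

TIP_INCHES = 0.01      # rain per bucket tip
last_tip = None        # timestamp of the previous tip
rate_in_per_hr = 0.0   # most recently computed rate

def on_tip(now=None):
    """Called on each tip pulse: the rate is 0.01" divided by the
    time since the previous tip, converted to inches per hour."""
    global last_tip, rate_in_per_hr
    now = now if now is not None else time.time()
    if last_tip is not None:
        interval = now - last_tip
        if interval > 0:
            rate_in_per_hr = TIP_INCHES * 3600.0 / interval
    last_tip = now

def current_rate(now=None, timeout=15 * 60):
    """If no tip has arrived within `timeout` seconds, report zero
    instead of holding the last computed rate forever."""
    now = now if now is not None else time.time()
    if last_tip is None or now - last_tip > timeout:
        return 0.0
    return rate_in_per_hr
```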
Dale