Thanks for the helpful posts.
A novice question - why would a long anemometer cable run reduce the actual wind speed I could record?
It probably has to do with the frequency of reed-switch closures at high MPH, and the impedance/transmission-line characteristics of the cable itself. As I recall, if you exceed the max wind speed for a given cable length, the measured speed won't just level off there, it will become lower and more random...
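To put some rough numbers on that: a reed-switch anemometer gives one closure per rotation, so pulse frequency scales with wind speed, while the cable's capacitance plus the console's pull-up resistance form an RC low-pass that can round off pulses on long runs. A minimal sketch, assuming a Davis-style calibration (mph = pulses × 2.25 / sample seconds) and hypothetical pull-up/cable values:

```python
import math

CAL = 2.25       # Davis-style calibration constant (mph per pulse per 2.25 s)
SAMPLE_S = 2.25  # sample window, seconds

def mph_from_pulses(pulses, sample_s=SAMPLE_S):
    """Wind speed implied by a pulse count over one sample window."""
    return pulses * CAL / sample_s

def pulse_hz_at(mph):
    """Pulses per second the switch must deliver to register a given speed."""
    return mph / CAL

# Hypothetical electrical values, just to show the shape of the problem:
R_PULLUP = 10_000    # ohms, assumed console pull-up
C_PER_FT = 30e-12    # farads per foot, typical twisted-pair figure

def cable_corner_hz(feet, r=R_PULLUP, c_per_ft=C_PER_FT):
    """RC low-pass corner frequency of the cable run."""
    return 1.0 / (2 * math.pi * r * c_per_ft * feet)
```

Running `pulse_hz_at(100)` shows the switch only needs to toggle a few tens of times per second even at 100 mph, so on short runs the RC corner is far above the signal; the trouble on very long runs comes from the corner dropping toward the pulse rate (and from contact bounce getting smeared), at which point counts start getting lost, which matches the "lower and more random" behavior described above.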
There is internet available at the transmitter site.
How fast do you want Internet updates? How reliable is the site power? Solar/mains/back-up gennie? How often do you get up there?
And I don’t need to see the gust data in real-time, but I want peaks captured so I can review the data later. (I’m a ham, but haven’t really considered sending the data out via APRS.)
Is the ability to catch gust measurements more a function of software capabilities than hardware instrumentation?
Hardware/firmware, rather than software...
An analog system, like a cup-driven generator with a voltmeter readout, will have minor lags in measuring speed due to cup/generator inertia and needle lag on the meter.
OTOH, any digital system will have some sampling period over which it measures the wind speed, either by counting pulses over a fixed time or by counting cycles of an oscillator between switch closures. Most are the former. They may also have a time delay between sampling periods.
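A toy illustration of that averaging effect (not any vendor's actual firmware): any gust shorter than the sample window gets diluted into the window's average, so the reported peak is lower than the true instantaneous peak.

```python
def window_average(samples, window):
    """Average per-tick instantaneous speeds over fixed, non-overlapping
    windows of `window` ticks, one reported value per window."""
    return [sum(samples[i:i + window]) / window
            for i in range(0, len(samples) - window + 1, window)]

# Ten ticks of steady 10 mph with a one-tick 40 mph spike:
true_speeds = [10] * 4 + [40] + [10] * 5
reported = window_average(true_speeds, window=5)
# The 40 mph spike lands in a 5-tick window, so the reported peak is
# only (10+10+10+10+40)/5 = 16 mph, not 40.
```

That's why a short, sharp gust can read much lower than what the cups actually saw.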
As previously noted, Davis samples and sends wind speed every 2.5 seconds, so the wind speed is "averaged" over that time.
I believe the NWS criterion for a gust is the highest wind recorded in a 10-minute period. That's what most of the programs seem to do. They bang on the Davis console every 2.5 seconds or less looking for the wind speed, to extract the highest as the gust. It's also what we have our WeatherElement data hub do.
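The poll-and-keep-the-max approach is simple to sketch. A minimal version, assuming 2.5 s polls and a trailing 10-minute window (the class and method names are just illustrative, not any real package's API):

```python
from collections import deque

POLL_S = 2.5     # seconds between console polls, per the thread
WINDOW_S = 600   # 10-minute trailing gust window

class GustTracker:
    """Track the highest speed seen in the trailing window of polls."""

    def __init__(self, window_s=WINDOW_S, poll_s=POLL_S):
        # Keep only enough readings to span the window; deque drops
        # the oldest reading automatically once maxlen is reached.
        self.readings = deque(maxlen=int(window_s / poll_s))

    def add(self, speed_mph):
        """Record one poll and return the current windowed gust."""
        self.readings.append(speed_mph)
        return max(self.readings)
```

Each call to `add()` returns the gust for the trailing window, and old readings age out on their own, so a brief peak stops being "the gust" once it is more than ten minutes old.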