Even if your array for some reason discarded the first 8 seconds of the 16-second interval, a 50% drop wouldn't make sense, because wind speed is not a cumulative observation (except for windrun). You would only get the average of the last 8 sampled seconds, with the highest value in that window reported as the gust (the maximum wind speed in the sampling interval). Some higher or lower values might escape detection when forming the average or the gust, but it's very unlikely that this would systematically cut the result in half.
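To illustrate that point, here's a minimal sketch (the one-second readings are made up, purely hypothetical) showing that averaging only the last 8 of 16 samples tracks the full-interval average rather than halving it:

```python
# Hypothetical one-second wind readings (m/s) over a 16-second interval.
samples = [4.2, 5.1, 3.8, 4.9, 6.0, 5.5, 4.7, 5.2,
           4.8, 5.3, 4.1, 5.9, 5.0, 4.6, 5.4, 4.9]

full_avg = sum(samples) / len(samples)   # average over all 16 s
last8_avg = sum(samples[8:]) / 8         # average if the first 8 s were discarded
gust = max(samples[8:])                  # gust = max of the sampled window

print(f"full 16 s average: {full_avg:.2f} m/s")
print(f"last 8 s average:  {last8_avg:.2f} m/s")  # close to the full average, not half
print(f"gust:              {gust:.2f} m/s")
```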
The theory I'm working with is that the sensor samples the cup rotation once per second for 8 seconds, then sends zeros for the remainder of the 16-second period.
Which might look something like this:
{1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0}
Where each 1 is a one-second reading of how fast the anemometer cups are spinning.
That gets bundled up and transmitted to the receiver, which I believe takes that array, totals it up, and divides by 16, so effectively half of the averaged data would be zeros. I haven't verified what's actually being sent over the air; I'm just going on how companies tend to behave with firmware, where quirks like that carry across models.
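If that's what's happening, the math works out to exactly half. A quick sketch of this hypothetical receiver behavior (the packet layout and the divide-by-16 are my guesses, not anything verified over the air):

```python
# Hypothetical 16-slot packet: 8 real one-second readings, then 8 padded zeros.
readings = [5.0, 5.2, 4.8, 5.1, 5.3, 4.9, 5.0, 5.1]   # made-up m/s values
packet = readings + [0.0] * 8

true_avg = sum(readings) / len(readings)   # what the 8 s of real data actually averages to
reported = sum(packet) / len(packet)       # receiver dividing the 16-slot total by 16

print(f"true 8 s average: {true_avg:.2f} m/s")
print(f"reported average: {reported:.2f} m/s")   # exactly half the true average
```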
I was also looking at formulas to work out physically (by videotaping) how fast the wind cups are spinning and correlating that against the data I have logged. What I might do is take the sensor down on a calm day, drive it down the highway, and see if the reported wind speed matches how fast I'm going.
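For the video approach, a rough conversion from rotation rate to wind speed would look something like the sketch below. The anemometer factor and cup radius here are assumptions, not values from this sensor's spec; in practice you'd calibrate the factor from a known-speed run like the highway test.

```python
import math

def estimated_wind_speed(revs_per_second, cup_radius_m, anemometer_factor=2.5):
    """Rough wind speed estimate from cup rotation rate.

    cup_center_speed = 2 * pi * r * f   (tangential speed of the cup centers)
    wind_speed ~= anemometer_factor * cup_center_speed

    The factor of roughly 2-3 is a typical ballpark for cup anemometers,
    not a measured value for this sensor; treat it as a placeholder.
    """
    cup_center_speed = 2 * math.pi * cup_radius_m * revs_per_second
    return anemometer_factor * cup_center_speed

# Example: cups counted at 3 revolutions per second on video,
# with an assumed 6 cm radius from hub to cup center.
print(f"{estimated_wind_speed(3.0, 0.06):.1f} m/s")
```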
I'm reasonably confident it's not a mechanical issue, though, if there's an issue at all. The sensor is only a few months old, and I probably would have felt that during installation.
While there are obstructions in the area, it has a good 100+ foot radius around it, give or take a tree trunk, and a clear westward view. I just have a hard time rationalizing the wind losing that much energy, especially compared to nearby users, one of whom has a WS-5000 that seems to report more reasonable figures.