This forum might not be the best place to ask this, but it's directly related to Davis stations and I'm hoping that weather experts/meteorologists might give some hints or a common, working solution.

So, I'm using my own receiver with WeeWx with my VP2 ISS and anemometer transmitter kit. I also have a Davis Weather Envoy. I observed that the T/H sensor inside my small sensor box reads about 2.22°C (4°F) higher than the internal temperature of the Envoy, and RH reads 4 to 8% lower. I applied a simple -2.22°C (-4°F) offset to the temperature, and now they're at most 0.1°C apart at any time. Good, we're within sensor accuracy and even better. However, applying a simple positive offset to the RH doesn't work; the difference fluctuates too much to call the result accurate. It's closer to reality, but still too far off.
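For what it's worth, a static offset like this can be expressed in WeeWx's [StdCalibrate] section. This is only a sketch: it assumes the sensor maps to outTemp, and note that corrections are evaluated in the unit system the driver emits, so a driver reporting °F would use -4 instead:

```
[StdCalibrate]
    [[Corrections]]
        # Subtract the observed bias from the raw reading.
        # -2.22 assumes metric (°C) records; use -4 for US (°F) records.
        outTemp = outTemp - 2.22
```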

Looking for a solution, I made an attempt to actually calculate the RH from the offset temperature (Tnew) and a dew point calculated from the original temperature (T). In short, the algorithm:

1. calculate Tnew from the original temperature T by adding the offset

2. calculate the dew point DP using T and the original RH reading

3. calculate the adjusted RH, RHnew, using Tnew and DP

The exact calculation method is detailed on this page.
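To make the steps concrete, here is a minimal Python sketch of that algorithm, assuming the Magnus approximation for the dew point. The function names, the default offset, and the constants a = 17.625, b = 243.04 are my choices for illustration; the page linked above may use a slightly different variant:

```python
import math

# Magnus approximation constants (one common parameter set; the page
# linked above may use a slightly different variant).
A = 17.625
B = 243.04  # °C

def dew_point(t, rh):
    """Dew point (°C) from temperature t (°C) and relative humidity rh (%)."""
    gamma = math.log(rh / 100.0) + (A * t) / (B + t)
    return (B * gamma) / (A - gamma)

def rh_at(t, dp):
    """RH (%) at temperature t (°C) for a fixed dew point dp (°C)."""
    return 100.0 * math.exp((A * dp) / (B + dp)) / math.exp((A * t) / (B + t))

def adjust_rh(t, rh, offset=-2.22):
    """Steps 1-3: offset the temperature, hold the dew point, recompute RH."""
    t_new = t + offset       # step 1: corrected temperature Tnew
    dp = dew_point(t, rh)    # step 2: dew point from the *original* reading
    return rh_at(t_new, dp)  # step 3: RH consistent with Tnew and DP

# Example: a raw reading of 22.0°C / 45% RH
print(round(adjust_rh(22.0, 45.0), 1))  # about 51.6% after correction
```

Since the dew point is held fixed while the temperature is corrected downwards, the recomputed RH comes out higher than the raw reading, which at least matches the direction of the 4 to 8% deficit I'm seeing.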

Now RHnew follows the Envoy's RH value within 0 to 2%. This should be acceptable, since both sensors have a datasheet accuracy of ±3%, but in the worst case the error could be ±5%. While this still looks somewhat acceptable, I have only a limited way to test it in an indoor environment, where temperature swings aren't great, at most 10°C. So I'm not 100% convinced this is the Right Way.

Any hints?