My thinking about all of this: maybe we need to bring dew point temperature into the discussion, because relative humidity is only relative to the ambient temperature and isn't a good indicator of the actual moisture content in the air, which IMO is the real issue.
100% RH at 32°F (0°C) for days still means very little moisture and won't oversaturate the sensor membrane, is my thinking. Now take 85% RH with a 65°F+ dew point: the sensor membrane absorbs lots of moisture in just hours.
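To put rough numbers on that, here's a sketch comparing the actual water vapor pressure (a proxy for moisture content) in the two scenarios. It uses the Magnus/Bolton saturation vapor pressure approximation, which is my choice for illustration here, not something from the posts above:

```python
import math

def sat_vapor_pressure_hpa(t_c: float) -> float:
    """Saturation vapor pressure in hPa via the Magnus/Bolton approximation."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

# Scenario 1: 100% RH at 32°F (0°C) - cold but saturated air
e_cold = 1.00 * sat_vapor_pressure_hpa(0.0)

# Scenario 2: 85% RH at 65°F (~18.3°C) - warm air, sub-saturated
e_warm = 0.85 * sat_vapor_pressure_hpa(18.3)

print(f"cold/saturated: {e_cold:.1f} hPa")
print(f"warm/85%:       {e_warm:.1f} hPa")
print(f"ratio: {e_warm / e_cold:.1f}x")
```

The warm 85% case carries roughly three times the water vapor of the cold saturated case, which is consistent with the membrane loading up much faster.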
When I conduct a test, I record dew points and then derive the correct humidity from them using a dew point calculator (see http://andrew.rsmas.miami.edu/bmcnoldy/Humidity.html - the underlying formula is widely used in weather software). Otherwise, if there were a temperature difference between the test device and the VP2, the humidities wouldn't be comparable. Dew point should always be the same regardless of the measured temperature. *
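For anyone who wants to script this instead of using the web calculator, here's a minimal sketch of the conversion. I'm assuming the common Magnus/Bolton approximation; the McNoldy page uses a formula of this family, but I haven't verified it matches coefficient-for-coefficient:

```python
import math

def sat_vp(t_c: float) -> float:
    """Saturation vapor pressure in hPa, Magnus/Bolton approximation."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

def rh_from_dewpoint(temp_c: float, dewpoint_c: float) -> float:
    """RH (%) a sensor at temp_c should report, given the air's dew point."""
    return 100.0 * sat_vp(dewpoint_c) / sat_vp(temp_c)

# Same 10°C dew point, read by sensors at two different temperatures:
print(round(rh_from_dewpoint(15.0, 10.0), 1))  # warmer sensor -> lower RH
print(round(rh_from_dewpoint(12.0, 10.0), 1))  # cooler sensor -> higher RH
```

This is exactly why comparing raw RH between two devices at different temperatures is misleading, while comparing dew points is not: the same air mass produces different RH readings depending on each sensor's local temperature.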
Regardless, and very curiously, I nearly always arrive at a +6% difference. The same goes for my backup VP2 with an older SHT31 - it shows a consistent +8% bias. Comparisons to the airport confirm the dew point errors and the test device's accuracy.
* NOTE: Just in case anyone is a stickler: normally I let the test device (a Kestrel) acclimate until there is less than a 1°F difference between it and the SHT31. That way even the reported humidities should match to within about 1%, and I can double-check my calculations. I don't hold the device in my hand; rather, I let it hang from a hook and orient the sensor into the wind on my covered, but otherwise open, porch (this way I can measure in rain and snow too). I've compared readings taken on the porch to those taken on a tripod next to the ISS, and they are the same. The problem with using a tripod is sun exposure, which is ill-advised. I constructed an umbrella for it, but that doesn't do enough to reduce radiation errors and acts as a sail in the wind. The north-facing porch method nearly always matches the airport to within 1°F when dew points are stable and well-mixed synoptic conditions prevail.