When it comes to humidity, there is nothing linear about its relation to temperature and dew point. The closer you get to 100%, the smaller the spread. For instance, going from 99% to 100% humidity may take a dew-point rise of only about 0.4 °F with the temperature held constant, whereas at 50% the spread could be several degrees of dew point just to reach 51%. Number-crunching at the wet end of the spectrum is much more touchy.
You can use this NOAA(*) empirical formula to get some idea of that non-linearity:
RH% = 100% * [(173 - 0.1*Ta + Tdp) / (173 + 0.9*Ta)]^8 ...an 8th power makes for a pretty steeply rising curve near saturation.
where:
RH% = relative humidity, percent
Tdp = dew point temperature, °F
Ta = ambient air temperature, °F
(*) developed by Julius F. Bosen, Office of Climatology, US Weather Bureau, 17-Nov-1958
Reference: "An Approximation Formula to Compute Relative Humidity from Dry Bulb and Dew Point Temperatures," Monthly Weather Review, Dec. 1958, p. 486.
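For a quick feel of that curve, here is a minimal Python sketch of the formula and its algebraic inverse (the function names are my own, not from the paper). It prints how much dew-point rise one extra point of RH costs at mid-range versus near saturation, with the air temperature held at 70 °F:

```python
def rh_bosen(ta_f, tdp_f):
    """Relative humidity (%) from ambient temperature and dew point,
    both in degrees F, using Bosen's 1958 approximation."""
    return 100.0 * ((173.0 - 0.1 * ta_f + tdp_f) / (173.0 + 0.9 * ta_f)) ** 8

def dewpoint_for_rh(ta_f, rh_pct):
    """Invert the formula: the dew point (deg F) that yields a given RH%
    at ambient temperature ta_f. (1/8th root undoes the 8th power.)"""
    return (rh_pct / 100.0) ** 0.125 * (173.0 + 0.9 * ta_f) - (173.0 - 0.1 * ta_f)

ta = 70.0
for lo, hi in [(50.0, 51.0), (99.0, 100.0)]:
    spread = dewpoint_for_rh(ta, hi) - dewpoint_for_rh(ta, lo)
    print(f"{lo:.0f}% -> {hi:.0f}% RH at {ta:.0f} F takes {spread:.2f} F of dew-point rise")
# → 50% -> 51% RH at 70 F takes 0.54 F of dew-point rise
# → 99% -> 100% RH at 70 F takes 0.30 F of dew-point rise
```

Note that when Tdp equals Ta the bracketed ratio is exactly 1, so the formula returns 100% RH, as it should at saturation.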