Hello to each of you who have read and/or replied...I wrote the original statement.
I would like to make one quick statement and then offer a few thoughts on the replies...
First of all...what a great group of people to be in such a discussion with...and if I might make a simple compliment - very smart people.
Ok...
to "Old Tele man...you have an amazing understanding of the basic physics and mathematics of the problem - thank you!
If I might add to your statements - I am trying to say that each "thermometer" (temp sensor) must include its radiation shielding as a part of the sensor. The shielding is critical because air temp is a shaded air temp and NOT a sun-heated air temp. So the shielding becomes part of the temp measurement. BUT... a highly shielded temp sensor is really just reading the inside volume temperature of the shield cavity and is NOT reading the true outside air temperature in real time. So the radiation shielding slows the temp reading until the shield air temp catches up with the real air temp. This all causes a thermal delay, or "thermal lag" as Old Tele man put it. For me this lag can be a significant problem. The lag, in my opinion, could be as long as 30 minutes on a warm, calm day.
But...the high for the day might only last for 10 minutes or less!!! So it is likely that high-performance shields do an excellent job of giving an accurate air temp but can be 30 minutes behind the true air temperature. Good, non-aspirated radiation shielding can even miss the high or low for the day if they happen quickly. The same is true for any critical temperature, such as the freezing point, with which the vineyards in my area are so very concerned.
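Just to put rough numbers on that, here is a little Python sketch (all values are assumptions for illustration, not measurements of any real shield) that treats the shield cavity as a simple first-order lag with a 30-minute time constant and feeds it a 4 degree spike lasting about 10 minutes:

```python
import math

TAU = 30.0 * 60.0   # assumed shield time constant: 30 minutes, in seconds
DT = 10.0           # simulation time step, seconds

def true_air_temp(t):
    """Hypothetical outside air: 70 F baseline plus a 4 F spike
    lasting roughly 10 minutes, centered at t = 1 hour."""
    return 70.0 + 4.0 * math.exp(-((t - 3600.0) / 300.0) ** 2)

# First-order lag: dT_shield/dt = (T_air - T_shield) / TAU
shield_temp = true_air_temp(0.0)
peak_seen = shield_temp
for step in range(int(2 * 3600 / DT)):
    t = step * DT
    shield_temp += (true_air_temp(t) - shield_temp) * (DT / TAU)
    peak_seen = max(peak_seen, shield_temp)

true_peak = max(true_air_temp(s) for s in range(2 * 3600))
print(f"True peak:          {true_peak:.2f} F")
print(f"Peak inside shield: {peak_seen:.2f} F")
```

With those assumed numbers the shield registers barely 1 of the 4 degrees of the spike, and the reading peaks well after the real air has already cooled - exactly the kind of missed daily high I am worried about.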
So how do we get a better real-time air temp reading? A simple fan placed in such a way that it draws in outside air and passes it over the sensor. The real critical question "Old Tele man" asked is: what airflow is needed? My answer to that, for me at this point in my knowledge of the problem, is the more airflow the better - without drawing in water during a rain storm. But there is also the problem of battery power if the system is on batteries and a solar panel. With those limitations my answer, so far, is something between 1 watt and 2.5 watts of power to run the fan. The fan I am using presently is 2" x 2" (5 cm x 5 cm) @ 2.5 watts and is small enough to fit on its side, between radiation shield rings, near the temp sensor on a Davis Vantage Pro. But I do not have information yet about heavy dew or rain and whether it will be sucked in by the fan and onto the sensors. I will probably build a small baffle to protect the sensors. I guess if my temp and humidity go blank, I have too much fan power!
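For a rough feel of what a fan that size buys, here is a back-of-the-envelope sketch - the flow rating and cavity volume below are pure assumptions, not specs for my fan or the Davis shield:

```python
# Back-of-the-envelope: how fast does a small fan flush the shield cavity?
# Both numbers below are assumptions for illustration, not measurements.

FAN_FLOW_CFM = 10.0     # assumed flow for a 50 mm, 2.5 W fan
CAVITY_LITERS = 0.5     # assumed air volume around the sensor

flow_l_per_s = FAN_FLOW_CFM * 28.317 / 60.0   # CFM -> liters/second
exchange_time_s = CAVITY_LITERS / flow_l_per_s

print(f"Flow: {flow_l_per_s:.1f} L/s")
print(f"Cavity air replaced every {exchange_time_s * 1000:.0f} ms")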
Actually any flow is better than none; it is a matter of reducing the temperature lag to a minimum. I would prefer to have a temp lag of less than 1 minute because I love to watch and understand "nano" climates, as I have come to call them. A few days ago, I saw a 4 degree F rise and fall, with no wind that I could feel or measure, all happening in less than 2 minutes. What was that?
As for "Nyquist", I am presently using an analog, cabled, temperature sensor measuring to 1/100th of a degree. The sensor and electronics are a project I am working on. I am using two stages of precision instrumentation op amps, with extreme cable shielding. I can hold my hand 4 inches from the temp sensor and see the temp rise 5/100s and then fall again as I move my hand away. So for me, temp readings are real-time and cpu cycle times are not an issue yet...but you raise a very real potential problem.
For "WeatherHost" there would definately be a fan control switch and in a fancy system, the fan speed would change as the difference between outside air temp and sensor temp increased - to conserve battery power. If CPU fan speeds change in high-end PCs and laptops we can do it with weather stations.
For DanS - that trace is almost proof in itself that data is being missed!
Thank you for your comments!!!
Steve