Humidity is a complicated concept. Humidity refers to evaporated water in the air, i.e. water vapor. For years relative humidity has been the number we use to inform the TV audience, but it has many pitfalls, and to tell you the truth almost no one understands it, including many meteorologists.
When thinking about humidity you always have to think in terms of energy, which of course is expressed as temperature. There is only so much energy to go around, and only part of that thermal energy can do the work of evaporation; the rest goes to the other molecules in the air.
Relative humidity expresses how much of the available energy has been used. A relative humidity of 50% means half the energy has been used to evaporate water from the ground, streams, lakes and anywhere else it sits, and 50% is still available to do more evaporation.
On a summer morning the temperature may be 75 degrees and the relative humidity 90%, a very sticky morning indeed. Without changing the amount of water vapor in the air, if the temperature hits 92 degrees in the afternoon, the relative humidity falls to 52%. Relative humidity is RELATIVE TO THE AMOUNT OF ENERGY AVAILABLE. Because the amount of energy increased as the sun warmed the atmosphere, the percentage of the energy used, i.e. the relative humidity, decreased, all the while with no change in the amount of vapor in the air.
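The morning-to-afternoon example can be checked numerically. The sketch below uses the Magnus approximation for saturation vapor pressure (the formula and its constants are a standard approximation I have chosen for illustration, not anything from this article) to hold the moisture fixed and recompute the relative humidity at the warmer temperature:

```python
import math

# Magnus approximation constants for saturation vapor pressure over water
A, B = 17.625, 243.04  # dimensionless, deg C

def sat_vapor_pressure(t_c):
    """Saturation vapor pressure (hPa) at temperature t_c (deg C)."""
    return 6.1094 * math.exp(A * t_c / (B + t_c))

def f_to_c(t_f):
    return (t_f - 32.0) * 5.0 / 9.0

# Morning: 75 deg F at 90% relative humidity
t_morning = f_to_c(75.0)
vapor_pressure = 0.90 * sat_vapor_pressure(t_morning)  # actual moisture, held fixed

# Afternoon: same moisture, but the air warms to 92 deg F
t_afternoon = f_to_c(92.0)
rh_afternoon = 100.0 * vapor_pressure / sat_vapor_pressure(t_afternoon)

print(f"Afternoon relative humidity: {rh_afternoon:.0f}%")  # about 52%
```

The moisture never changes in this calculation; only the denominator (the energy available, in the article's terms) grows, and the percentage drops from 90 to about 52.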
So when you hear someone say it feels worse than 52% relative humidity today, they do not understand the concept of RELATIVE HUMIDITY. Ninety-two degrees and 52% is a very humid afternoon. Because the concept is so confusing, many meteorologists are trying to get people to use dew point temperature instead.
Dew point temperature is a measure of humidity. If you take a parcel of air and cool it, eventually you will remove enough energy for water vapor to begin to condense. Remember, the water vapor was originally liquid water, and to get it to evaporate you had to add energy. As long as it has sufficient energy it will remain vapor, but as you cool it, at some point condensation will occur. The temperature at which condensation begins is the dew point temperature. In terms of relative humidity: as the parcel of air is cooled, the relative humidity increases, and when the relative humidity reaches 100% you are at the dew point temperature.
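The cooling process described above can be simulated directly: hold the moisture content fixed, lower the temperature a little at a time, and stop when the relative humidity reaches 100%. This sketch again leans on the standard Magnus approximation rather than any formula from the article:

```python
import math

A, B = 17.625, 243.04  # Magnus constants (deg C)

def sat_vapor_pressure(t_c):
    """Saturation vapor pressure (hPa) over water at t_c deg C."""
    return 6.1094 * math.exp(A * t_c / (B + t_c))

def dew_point_by_cooling(t_c, rh_percent, step=0.01):
    """Cool a parcel in small steps until its relative humidity hits 100%.

    The moisture (actual vapor pressure) stays constant while cooling;
    only the saturation vapor pressure drops with temperature.
    """
    vapor_pressure = (rh_percent / 100.0) * sat_vapor_pressure(t_c)
    t = t_c
    while vapor_pressure < sat_vapor_pressure(t):
        t -= step  # remove a little more energy
    return t  # condensation begins here: the dew point

# A 75 deg F (about 23.9 C) parcel at 90% relative humidity
td = dew_point_by_cooling(23.9, 90.0)
print(f"Dew point: {td:.1f} C ({td * 9 / 5 + 32:.0f} F)")
```

The loop is the article's thought experiment made literal: nothing about the moisture changes, we only keep removing energy until the vapor can no longer stay vapor.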
Unlike relative humidity, if the dew point increases it is only because the amount of moisture increased. If relative humidity changes, it can be because of a temperature change or a moisture change; two variables lead to too many possibilities. With dew point it is strictly moisture you are tracking.
Dew point can never be higher than the temperature. At saturation, i.e. 100% relative humidity, the temperature and dew point are the same.
On a typical summer day the following apply:
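A rough dew point comfort scale can be written down as a lookup. The breakpoints below are a commonly used rule of thumb, chosen here for illustration; they are not necessarily the author's exact figures:

```python
def comfort(dew_point_f):
    """Rough comfort description for a summer dew point (deg F).

    These breakpoints are a commonly used rule of thumb, picked for
    illustration; they are not taken from the article.
    """
    if dew_point_f < 55:
        return "comfortable"
    elif dew_point_f < 65:
        return "sticky"
    elif dew_point_f < 75:
        return "oppressive"
    else:
        return "extremely uncomfortable"

for td in (50, 60, 70, 78):
    print(f"dew point {td} F -> {comfort(td)}")
```

Because dew point tracks moisture alone, a single threshold table like this works at any air temperature, something no relative humidity table can do.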
One last thing: if you ever hear someone say it was 90 degrees and the humidity was 90%, know that that has never happened in Cincinnati (and unless the greenhouse effect goes into overdrive, never will). 90 deg/90% requires a dew point of 85.5 degrees. In Cincinnati the highest dew point ever recorded was 81 deg., and that for just a few minutes.
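The 90-degrees-at-90% claim can be checked with a closed-form dew point formula. The Magnus approximation below (a standard formula, not the article's) lands within about a degree of the 85.5 quoted above:

```python
import math

A, B = 17.625, 243.04  # Magnus constants (deg C)

def dew_point_c(t_c, rh_percent):
    """Closed-form Magnus dew point (deg C) from temperature and RH."""
    gamma = math.log(rh_percent / 100.0) + A * t_c / (B + t_c)
    return B * gamma / (A - gamma)

t_c = (90.0 - 32.0) * 5.0 / 9.0        # 90 deg F in Celsius
td_f = dew_point_c(t_c, 90.0) * 9.0 / 5.0 + 32.0
print(f"Dew point for 90 F / 90%: {td_f:.1f} F")  # roughly 86 F
```

Either way the answer sits several degrees above Cincinnati's all-time high dew point of 81, which is the point: 90/90 there is physically off the table.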
In August 1995 we had four consecutive hours of 78, 79, 78 and 77 degree dew points, the highest persistent dew points I have seen in Cincinnati since working here as a meteorologist. For one hour I did see a dew point of 81, just after a thunderstorm.
However, veterans of the Persian Gulf War know what 90/90 feels like. The Persian Gulf and the Red Sea both attain sea water temperatures in the mid 90s. That is plenty of energy, along with 115 degree air temperatures, to evaporate water.
A dew point of 94 deg. F has been measured on the shore of Ethiopia, in an area that is now part of Eritrea: the highest known dew point temperature in the world. The relative humidity with a temperature of 115 and a dew point of 94 is 54%, which does not tell you as much as the dew point does when you consider the table above.
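That 54% figure can be reproduced. With the moisture fixed by a 94-degree dew point, the relative humidity at 115 degrees is just the ratio of two saturation vapor pressures (Magnus approximation once more, as a sketch rather than the author's own calculation):

```python
import math

A, B = 17.625, 243.04  # Magnus constants (deg C)

def sat_vapor_pressure(t_c):
    """Saturation vapor pressure (hPa) over water at t_c deg C."""
    return 6.1094 * math.exp(A * t_c / (B + t_c))

def f_to_c(t_f):
    return (t_f - 32.0) * 5.0 / 9.0

# RH = actual vapor pressure / saturation vapor pressure.
# The actual vapor pressure equals the saturation value at the dew point.
rh = 100.0 * sat_vapor_pressure(f_to_c(94.0)) / sat_vapor_pressure(f_to_c(115.0))
print(f"Relative humidity at 115 F with a 94 F dew point: {rh:.0f}%")  # about 54%
```

A middling-sounding 54% hides a world-record amount of moisture, which is exactly the article's argument for tracking dew point instead.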
© 1995 Steven L. Horstmeyer, all rights reserved