Temperature and relative humidity are all you need to calculate dewpoint. I'd say anyone who's enough of a weather nerd to get annoyed that the TV weather forecaster doesn't say the dewpoint should probably just get better at doing that calculation in their head.
That's true, but the average viewer wouldn't be able to do that in their head.
I hate when the other people at the news desk ask stupid questions such as "When are we gonna get some relief from this heat?" in the middle of July or "When is it gonna warm up?" in February. Such stupid people.
This is the DC area - mid-Atlantic and, realistically, a hot and muggy swamp. We can get snow anytime from November through April, and usually do. Fall and spring are just extended versions of summer and winter.
Some fun reading about different ways to measure HUMIDITY
Reminds me of this weather forecast from the "National Lampoon Sunday Newspaper Parody":
Hot and humid this summer with a cooling trend in the fall. Cold and wintery all winter with warmer weather following in the spring.
Relative humidity is important. For example, it tells you how extreme fire activity will be: we know it will be far less extreme today with relative humidity at 40% instead of yesterday's 9%, even though the dew point didn't change.
Funny you should point to an online calculator when admonishing people to get better at doing calculations in their head.
There is no simple equation for converting RH to dewpoint that you can easily do in your head. The closest is Td ≈ T - (100 - RH)/4 (Celsius), but that is just an approximation and only works within a certain RH range (roughly above 50%).
Or the weatherman could just tell us the dewpoint (useful info) and leave out the RH (not useful info). It would be similar to giving you only the wind chill or heat index and not the actual temperature. Sure, if they gave you the wind chill and the wind speed you could work out the temperature, but isn't it easier and more relevant to just give us the temp?
Because relative humidity doesn't tell the whole story. It could be 0 F with 100% relative humidity or 100 F with 40% relative humidity; which one is more humid? Dew point is a true measure of how humid it really is. The hotter the air is, the more moisture it can hold. That's why relying on relative humidity alone is misleading.
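Running those two cases through the Magnus dewpoint approximation makes the point concrete (a rough sketch; function names are mine):

```python
import math

def dewpoint_c(t_c, rh):
    """Dewpoint (deg C) via the Magnus approximation (Alduchov-Eskridge coefficients)."""
    a, b = 17.625, 243.04
    g = math.log(rh / 100.0) + a * t_c / (b + t_c)
    return b * g / (a - g)

def f_to_c(f):
    return (f - 32.0) * 5.0 / 9.0

def c_to_f(c):
    return c * 9.0 / 5.0 + 32.0

# 0 F at 100% RH: saturated air, so the dewpoint is the air temp itself, 0 F.
cold = c_to_f(dewpoint_c(f_to_c(0), 100))
# 100 F at 40% RH: the dewpoint lands around 70 F, which is oppressive.
hot = c_to_f(dewpoint_c(f_to_c(100), 40))
print(round(cold), round(hot))
```

So the 100 F / 40% day holds far more moisture despite the much lower RH number.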
I think pretty much the opposite.
Sure, a 16C dewpoint means the air holds more moisture than an 8C dewpoint, but a 16C dewpoint with a 30C temperature will feel less humid than an 8C dewpoint with a 10C temperature. I mean, physically feel.
Most people I know are better able to figure out how humid the air will feel from RH than from dew point. It is more a matter of what you are used to. Wind chill is one attribute that is very useful here, where -30C plus wind chill can make it feel like -40C. Hearing that tomorrow is going to be 35C with 20% RH means more to people who understand RH (most people who live in areas where RH is what gets reported) than telling them it is going to be 35C with a dew point of 12C (or whatever it would be; I just picked a number at random). We get wind chill and wind speed as well as the air temperature, plus the humidex: wind chill in winter, humidex in summer. A humidex in winter would make no sense; it would be the same number as the temperature, as it often is in summer too.
Information is useful if you understand it, and for those who understand RH it is useful information. The same could be said of which temperature scale you use: our younger people might not have a clue whether 87F is warm or hot, and some people in other countries would not know whether 38C is hot. When I taught this stuff in a first-year college lab, the students knew RH and some had to learn dew point. I just do not get how you think RH is useless information. To me it seems you simply prefer dew point, or are more used to that form of measurement.
East of the Rockies (and in the Mojave desert) dew point is way more indicative of comfort than relative humidity. So I consider RH a useless calculation. Nobody really gets what's a comfortable RH at any given temperature, but the dew point sure tells a lot.
Frankly the Weather Channel annoys me way more than RH. I have a low tolerance for outright stupidity.
Weather people are the only ones who can be wrong 95% of the time, make all sorts of mistakes, and still keep their jobs. Never fails!