If a heat index of 148 has occurred in the U.S., and heat indexes over 130 F are not uncommon, that would mean an actual temperature of 155 F with a humidity of 1-2 percent would actually be more "survivable" than most people think. So is my fictional climate of LOCO still "instantly lethal" even though its average summer temperatures are in the 150s F, given humidity levels around 1 percent? //www.city-data.com/forum/weath...mate-loco.html
I would imagine that a "dry" 150 F in LOCO (even though it has never been recorded in the real world) would be far more tolerable than the highest dewpoint ever recorded in the real world. What was the highest heat index ever recorded in the world?
To my understanding, the heat index is a subjective scale calibrated to a particular dewpoint. For example, some heat index formulas are calibrated to a 60-degree dewpoint, meaning that when the dewpoint is 60, the heat index equals the dry bulb temperature. This also means that low dewpoints, particularly at high temperatures, generate a heat index lower than the dry bulb temperature: the temperature "feels" lower than it actually is, because the formula assumes the observer is used to higher dewpoints. This is why heat indexes can vary from source to source. Dewpoint is a more reliable way to gauge oppressive heat because it is an absolute humidity measure, as opposed to a relative one.
Dewpoint is a far more accurate way of determining the perceived temperature than dry bulb temp when dewpoints are above 60. This is how Houston, New Orleans, and Miami can generate astronomical heat indexes while never hitting triple digits, while Phoenix can post temperatures in the 120s and have a lower heat index.
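For anyone who wants to play with the numbers in this thread, here's a rough Python sketch of the NWS Rothfusz regression that most published heat index values trace back to. The coefficients are the standard ones from the NWS write-up; the low-humidity adjustment is my reading of the NWS notes, so treat the whole thing as approximate:

```python
import math

def heat_index_f(temp_f, rh_pct):
    """Approximate NWS heat index (Rothfusz 1990 regression).

    temp_f: dry bulb temperature in degrees F (fit intended for >= 80 F)
    rh_pct: relative humidity in percent
    """
    t, r = temp_f, rh_pct
    hi = (-42.379 + 2.04901523 * t + 10.14333127 * r
          - 0.22475541 * t * r - 6.83783e-3 * t * t
          - 5.481717e-2 * r * r + 1.22874e-3 * t * t * r
          + 8.5282e-4 * t * r * r - 1.99e-6 * t * t * r * r)
    # NWS subtracts a small adjustment in very dry air (RH < 13%, 80-112 F)
    if r < 13 and 80 <= t <= 112:
        hi -= ((13 - r) / 4) * math.sqrt((17 - abs(t - 95)) / 17)
    return hi

# 90 F at 70% humidity lands near 106 F, in line with the NWS chart
print(round(heat_index_f(90, 70)))
# 105 F at 10% humidity comes out *below* the air temperature,
# which is exactly the "low dewpoint, lower heat index" effect above
print(round(heat_index_f(105, 10)))
```

One caveat for the LOCO question: this is a statistical fit, so plugging in something like 155 F at 1% humidity is extrapolating well outside the conditions it was fit to.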
I disagree. If it's 105 with a dewpoint of 65, that'll feel hotter than if it's 85 with a dewpoint of 70. I think the heat index is a fair way to combine the two.
I've noticed the humidex usually gives a higher estimate of how warm it feels, especially when high humidity kicks in at relatively mild temperatures (say in the 70s). Heat index is usually nada until temps climb higher, into the 80s. And yet in my experience, a relatively mild 74F with 90% humidity feels way warmer than 74F with 20% humidity. The humidex tends to reflect this; the heat index never seems to.
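For what it's worth, the humidex is driven directly by dewpoint via vapour pressure, which is why it reacts to muggy-but-mild conditions where the heat index stays flat. Here's a quick Python sketch of the Environment Canada formula (my own translation of it, so double-check before relying on it; inputs and output are in Celsius):

```python
import math

def humidex_c(temp_c, dewpoint_c):
    """Environment Canada humidex: air temp plus a dewpoint-based vapour term."""
    # Vapour pressure (hPa) from the dewpoint, referenced to 273.16 K
    e = 6.11 * math.exp(5417.7530 * (1 / 273.16 - 1 / (dewpoint_c + 273.16)))
    return temp_c + 0.5555 * (e - 10.0)

# 74 F is about 23.3 C; 90% humidity puts the dewpoint near 71 F (~21.7 C)
print(round(humidex_c(23.3, 21.7)))   # humid case: well above the air temp
# 20% humidity at 74 F puts the dewpoint around 30 F (~-1 C)
print(round(humidex_c(23.3, -1.0)))   # dry case: humidex below the air temp
```

So at the same 74 F air temperature, the 90%-humidity case gets pushed up several degrees while the 20% case actually drops below the air temperature, which matches the experience described above.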