Sunburn at Different Dewpoints
Can lower dewpoints offset higher sun angles, in relation to sunburn?
Friends currently visiting from the Gold Coast say they have had worse sunburn during two days here than at any time during spring back in Queensland. I was doubtful, but they do look sunburnt, although not that badly. They said the sun felt worse than expected, even though it hasn't been that sunny for the last two days.
Lower dewpoints resulting in a clearer atmosphere seemed like the obvious reason. UV is reasonably high at about 11, but I would have thought the Gold Coast's would be higher. Dewpoints are much lower here though, at about 16-17C (62F) yesterday.
Has anyone noticed a big difference in burn time, due to humidity?
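For a rough feel for the numbers being discussed, the UV index has a fixed definition (UVI = erythemally weighted irradiance in W/m² × 40), so a ballpark unprotected time-to-burn can be sketched. This is a simplification, assuming a minimal erythemal dose (MED) of about 250 J/m² for fair, untanned skin; real burn times vary a lot by skin type and conditions:

```python
# Ballpark time-to-burn from the UV index.
# Assumptions: UVI = erythemal irradiance (W/m^2) * 40 (the WHO definition),
# and an MED of ~250 J/m^2 for fair, untanned (type II) skin.

def minutes_to_burn(uv_index: float, med_j_per_m2: float = 250.0) -> float:
    """Approximate unprotected minutes until mild sunburn."""
    irradiance = uv_index / 40.0            # erythemal W/m^2
    return med_j_per_m2 / irradiance / 60.0  # seconds -> minutes

for uvi in (3, 7, 11, 12):
    print(f"UVI {uvi:>2}: ~{minutes_to_burn(uvi):.0f} min")
```

At a UVI of 11 this works out to roughly 15 minutes for fair skin, which helps explain burning even on a dull-looking day.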
My experience as a beachgoer is that humidity can reduce the threat of serious sunburn (and lengthen the time before you get burned), although 62F is not all that low a dew point per se.
Still, I’ve gotten a MUCH worse sunburn in the winter in the Caribbean or Florida than I would ever get in summer.
That's interesting. Even though the winter UV is much lower, the lower dewpoints compensate, meaning more sunburn.
I meant that the 16C dew point was low compared to Queensland. It's reasonably high by local standards, with 20C/68F being at the higher end here.
My friends were busy slapping on sunblock this morning, as they were heading out on the water today - a good place to get fried.
Dew points of 16-17C are actually close to average for the Gold Coast at this time of year. The max UV index forecast for the Gold Coast today is 12, so not much higher than yours. Take a look here: Ultraviolet (UV) Index Forecast
I remember getting a little sunburnt when I visited both Tasmania and New Zealand. It really caught me off guard, because at home, temperatures like a New Zealand or Tasmanian summer usually come with a UV rating of only about 5-7.
Somewhere like Dalby sounds like a bad place for sunburn, with higher UV and (presumably) lower dewpoints.
My friends were just as surprised as you, not expecting to get so sunburnt at 23C with only dull sunshine.
I haven't really noticed any major direct relationship with dewpoints. While it makes sense that higher dewpoints are an indicator of cloudier conditions or a mistier atmosphere, I've also heard that under partly cloudy skies, UV-B rays can actually be increased by up to 40%.
It's also possible that since they're on holiday, they've been outside a lot more than usual and just have had more chance to burn. When I was in London on holiday in September '09, I got mildly sunburned on a mostly sunny, warm day (around 27C, iirc) after having been outside all day long. Mind you, I was still pretty surprised, as I can't imagine the UV index in London a couple of weeks before the equinox would be much higher than Perth's in midwinter.
In my experience, unless you get dark overcast that looks like it could rain, a typical "bright" overcast day cuts the UV level only in half. Plus, the diffused light bounces around more easily, so you feel the UV coming upwards and sideways, not just from the direction of the sun. Hazy skies and high cloud probably only reduce the UV by 10-25%?
My best guess is that water vapour is far more effective at filtering infrared, which is why cars and buildings stay cool on cloudy days.
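Plugging rough attenuation figures like the ones above into a clear-sky index gives a feel for why a dull day can still burn. Note the fractions here are the kind of estimates quoted in this thread, not measured values, and the clear-sky UVI of 11 is just the figure from the original post:

```python
# Effective UV index under different sky conditions, using rough
# attenuation fractions like those discussed above (estimates, not data).
CLEAR_SKY_UVI = 11.0

attenuation = {                       # fraction of UV removed
    "clear sky": 0.0,
    "hazy / high cloud": 0.15,        # quoted range ~10-25%
    "bright overcast": 0.50,          # "cuts the UV level only in half"
    "partly cloudy (edge boost)": -0.20,  # UV-B can be *increased*, up to ~40%
}

for sky, frac in attenuation.items():
    print(f"{sky:28s} -> effective UVI ~{CLEAR_SKY_UVI * (1 - frac):.1f}")
```

Even the overcast case leaves an effective index around 5.5, which is still "moderate to high" by the usual UV index categories.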
I really have never looked up the differences in UV index between summer and winter, but are you sure the difference is that great? I would think the closer one gets to the lower latitudes, the less the difference between high sun (summer) and low sun (winter). Again, I've never looked them up, so it's just a guess.
Maybe in a place like the Gold Coast or Brisbane, it's not really the change in UV (high sun vs low sun), since it's fairly high all year. Rather, as others have mentioned, in the drier time of year with lower dew points/RH, one forgets how long one has been sitting in the sun. I know this has some merit, as I've done it in Florida many times. On a typical June day, when the temp is 87F and the dew point is near 68-70F, the high heat index makes you go in and out of the sun, and beach days are not as long. In the winter months, with a high of 78F, clear skies, and low dew points, people often fall asleep in the sun for 4 or 5 hours. Then they wake up and they've been burned.
If one was in the lower lat
Summer UV averages about 8-10 here, peaking at about 13. Winter averages 2-3, but sunburn is possible on rare occasions, even in July.
My friends were both well tanned when they got here and one works outside. They didn't spend an excessive amount of time in the sun either.
Using the argument of lower humidity, could the PNW be a worse place for sunburn than Florida during summer?