To be clear, a standard deviation is not the same as a raw temperature anomaly. The standard deviation in January is more than 4.5°C, so a month more than 3 standard deviations out in January means an anomaly of over 13.5°C. By comparison, the standard deviation in October is only 1.2°C, so 3 standard deviations in October is just 3.6°C. Standard deviations are the only way to properly compare seasons.
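The arithmetic above can be sketched as a small helper (the SD figures, 4.5°C for January and 1.2°C for October, are the poster's own local values):

```python
def in_std_devs(anomaly_c, month_sd_c):
    """Express a raw temperature anomaly (degrees C) in standard deviations."""
    return anomaly_c / month_sd_c

# The same 3.6 C anomaly is unremarkable in January but extreme in October:
print(in_std_devs(3.6, 4.5))  # 0.8 standard deviations (January SD = 4.5 C)
print(in_std_devs(3.6, 1.2))  # 3.0 standard deviations (October SD = 1.2 C)
```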
I was not talking about the standard deviation, just the difference from the average. I thought "deviation" alone (without "standard") would have been a suitable word for "difference", but obviously not... and yes, I know what standard deviations are; I use them all the time in my weather records.
In my weather records, March and April have the highest standard deviations, and October has the smallest. March 2013's cold snap was 1.6 standard deviations colder than average, and December 2010's was 1.5 standard deviations colder than average. But despite all that, it was obvious to any blithering idiot that they were the coldest such months for a very, very long time (since before many people around today were born).
By the way, "deviation" without "standard" is still a perfectly adequate term in English for any difference from an average.
July 2013 was 1.3 stdevs higher than average.
July 2006 was 1.9 stdevs higher than average.
April 2011 was 1.4 stdevs higher than average. Funny, I would have thought it was more.
March 2012 was only 1.0 stdevs higher than average.
The problem with my data, though, is that I don't really have enough years here: I've only been keeping records for less than 10 years, so that will be affecting the results.
That's why I prefer to use the national records rather than the local ones, since they go back so much further and are so much better documented. I bet April 2011 would rival December 2010 and March 2013 as the most exceptional month of recent times if you calculate the standard deviations for the CET against the 1981-2010 figures, yet it will be forgotten much sooner.
COLDEST MONTHS (standard deviations from average):
1) December 1740 = -3.6
2) January 1795 = -3.18
3) February 1947 = -3.14 (AKA negative pi)
4) January 1684 = -3.12
5) February 1895 = -3.10
HOTTEST MONTHS (standard deviations from average):
1) June 1846 = 3.56
2) May 1833 = 3.39
3) June 1676 = 3.38
4) August 1995 = 3.28
5) July 2006 = 3.24
6) April 2011 = 3.18
And what time period of averages are you comparing against?
The typical 30 year average in the late 18th century was rather lower than the modern 30 year periods.
1740 was a right joke of a year, Buxton probably qualified as a subarctic climate that year.