
Why A 30-Year "Normal"?



Why is 30 years the magic number? This question came to my mind when looking at April precipitation figures for Washington, DC. The 1971-2000 "normal" that was used until recently seemed to show that April is a dry month in D.C., with only 2.77 inches of precipitation on average. In fact, adjusted for the number of days in each month, the 0.092-inch daily average for April during that period ranked dead last among the 12 months, narrowly edging out February. However, the new D.C. 1981-2010 April "normal" is 3.06 inches, or 0.102 inches per day on average, which ranks 8th among the 12 months. Looking at the entire historical record, which dates to 1871, average April precipitation in D.C. has been 3.13 inches, or 0.104 inches per day on average, which ranks 7th among the 12 months.

So, it appears that, in historical context, 1971-2000 April precipitation in D.C. was unusually low, giving the misleading impression to those who relied on the 1971-2000 "normal" during the past decade that April is an especially dry month in the nation's capital. While the percentage difference between 2.77 and 3.13 inches is not all that dramatic, eyeballing other data makes clear that a period as short as 30 years can produce more extreme percentage differences. Has consideration been given to using periods other than 30 years for normals?
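As a quick check, the per-day figures quoted above can be reproduced by dividing each monthly total by the number of days in the month (a minimal sketch using only the D.C. April values from the post; month lengths come from the standard library):

```python
import calendar

def per_day(total_inches, month, year=2001):
    """Average precipitation per day for a monthly total (non-leap year)."""
    days = calendar.monthrange(year, month)[1]  # number of days in the month
    return total_inches / days

# April (30 days), Washington, DC values quoted above
assert round(per_day(2.77, 4), 3) == 0.092   # 1971-2000 normal
assert round(per_day(3.06, 4), 3) == 0.102   # 1981-2010 normal
assert round(per_day(3.13, 4), 3) == 0.104   # 1871-2010 period of record
```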


Why is 30 years the magic number? This question came to my mind when looking at April precipitation figures for Washington, DC. The 1971-2000 "normal" that was used until recently seemed to show that April is a dry month in D.C., with only 2.77 inches of precipitation on average. In fact, adjusted for the number of days in each month, the 0.092-inch daily average for April during that period ranked dead last among the 12 months, narrowly edging out February. However, the new D.C. 1981-2010 April "normal" is 3.06 inches, or 0.102 inches per day on average, which ranks 8th among the 12 months. Looking at the entire historical record, which dates to 1871, average April precipitation in D.C. has been 3.13 inches, or 0.104 inches per day on average, which ranks 7th among the 12 months.

So, it appears that, in historical context, 1971-2000 April precipitation in D.C. was unusually low, giving the misleading impression to those who relied on the 1971-2000 "normal" during the past decade that April is an especially dry month in the nation's capital. While the percentage difference between 2.77 and 3.13 inches is not all that dramatic, eyeballing other data makes clear that a period as short as 30 years can produce more extreme percentage differences. Has consideration been given to using periods other than 30 years for normals?

I'm not a met, but off the top of my head it seems to me that 30 years was chosen because it was long enough to give stability to the averages but at the same time short enough to show more recent trends if they differ from the longer-term norms.

I'm sure a met will be along shortly to tell me I'm wrong. :sun:


I'm not a met, but off the top of my head it seems to me that 30 years was chosen because it was long enough to give stability to the averages but at the same time short enough to show more recent trends if they differ from the longer-term norms.

I'm sure a met will be along shortly to tell me I'm wrong. :sun:

I don't know the exact rationale, but that seems like the most likely reason. There's certainly never been any kind of sensitivity study or anything; everyone just uses an across-the-board 30 years.

I thought this might be a facet of the statistics discipline... like a sample size of 30 is considered a "normal" distribution or something geeky like that.

I forget all these things. :arrowhead:

Nope. A sample size of 30 is considered "large enough" when the objects are sampled identically and independently. Daily climate data fails the latter test.
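The independence point can be illustrated with a quick simulation: when samples are autocorrelated (here a toy AR(1) process standing in for daily data; the process and its parameters are my own arbitrary assumptions, not from any climatology reference), the mean of 30 values is much noisier than the textbook "n = 30" intuition suggests.

```python
import random

random.seed(42)

def ar1_series(n, phi):
    """AR(1) process: each value is phi times the previous plus fresh noise."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0, 1)
        out.append(x)
    return out

def variance_of_means(phi, n=30, trials=5000):
    """Empirical variance of the mean of n (possibly correlated) samples."""
    means = [sum(ar1_series(n, phi)) / n for _ in range(trials)]
    mu = sum(means) / trials
    return sum((m - mu) ** 2 for m in means) / trials

var_iid = variance_of_means(0.0)   # independent samples: var of mean ~ 1/30
var_ar1 = variance_of_means(0.7)   # correlated samples: substantially larger
assert var_ar1 > 2 * var_iid       # 30 correlated values carry less information
```

In other words, 30 autocorrelated observations behave like far fewer independent ones, which is one reason the "n = 30" rule of thumb doesn't transfer directly to daily climate data.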


I'm not a met, but off the top of my head it seems to me that 30 years was chosen because it was long enough to give stability to the averages but at the same time short enough to show more recent trends if they differ from the longer-term norms.

Not to mention different instruments being used over time. I'd assume that one reason they don't use a longer period is to keep the data record as homogeneous as possible.


Not to mention different instruments being used over time. I'd assume that one reason they don't use a longer period is to keep the data record as homogeneous as possible.

What I'm wondering is how useful the normals are, at least when it comes to precipitation. Here is another example from historical Washington, DC data: For the entire period of record, 1871-2010, June precipitation has averaged 3.75 inches, making it the 3rd rainiest month in D.C. However, during 1971-2000, June D.C. precipitation averaged only 3.13 inches, dropping June down into the middle of the pack among the 12 months. But with the recent shift to 1981-2010 data, June D.C. precipitation rises to near the top of the pack, at 3.77 inches. So, the 1971-2000 data appear to have been a short-term aberration, which may have suggested a fundamental shift in D.C. precipitation that, in reality, did not take place.


The Earth is roughly 4.5 billion years old, so 30 years isn't even a drop in the bucket as far as what's "normal" or "not normal." We really should have "normal temperature readings" from 100 years after the last ice age up until 100 years ago. That would give us a proper sample size. Unfortunately, humans didn't really begin keeping records until the mid-to-late 1800s... soooo we basically have no idea what normal and abnormal is... that is, unless you're Al Gore, and then everything is "above normal."


I'm not a met, but off the top of my head it seems to me that 30 years was chosen because it was long enough to give stability to the averages but at the same time short enough to show more recent trends if they differ from the longer-term norms.

I'm sure a met will be along shortly to tell me I'm wrong. :sun:

Yes, I think that's it. Even if that's not the official rationale, it's how things work out in practice - and maybe it is the rationale.

What I'm wondering is how useful the normals are, at least when it comes to precipitation. Here is another example from historical Washington, DC data: For the entire period of record, 1871-2010, June precipitation has averaged 3.75 inches, making it the 3rd rainiest month in D.C. However, during 1971-2000, June D.C. precipitation averaged only 3.13 inches, dropping June down into the middle of the pack among the 12 months. But with the recent shift to 1981-2010 data, June D.C. precipitation rises to near the top of the pack, at 3.77 inches. So, the 1971-2000 data appear to have been a short-term aberration, which may have suggested a fundamental shift in D.C. precipitation that, in reality, did not take place.

I did a statistical test, posted back on Eastern, which showed that a given year's annual snowfall correlates most strongly with the preceding 30-year mean, not the 50- or 100-year mean. In other words, statistically, a 30-year mean predicts next year's (or next decade's) snowfall better than a 50- or 100-year mean.

I didn't do this test for precip or temperature, but I would imagine it is especially true for temperature, and probably somewhat true for precip.

As others have pointed out though (ORH), no computed average can substitute for a good knowledge of long-term weather patterns.
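A sketch of the kind of test described above, using synthetic data since the original Eastern post isn't available (the drift-plus-noise "snowfall" series and all its parameters are purely illustrative assumptions, not the poster's actual method or data):

```python
import random

random.seed(0)

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Synthetic annual "snowfall": slow drift plus year-to-year noise (illustrative).
level, snow = 25.0, []
for _ in range(140):
    level += random.gauss(0, 0.3)                        # slow climate drift
    snow.append(max(0.0, level + random.gauss(0, 8)))    # noisy annual total

def trailing_mean_corr(series, n, start=100):
    """Correlate each year's value with the mean of the preceding n years."""
    xs = [sum(series[y - n:y]) / n for y in range(start, len(series))]
    return corr(xs, series[start:])

results = {n: trailing_mean_corr(snow, n) for n in (30, 50, 100)}
for n, r in results.items():
    print(f"{n:>3}-year trailing mean vs. next year: r = {r:+.2f}")
```

Whether the 30-year mean wins on any particular synthetic series depends on how strong the drift is relative to the noise; the point is only to show the shape of the comparison.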


The Earth is roughly 4.5 billion years old, so 30 years isn't even a drop in the bucket as far as what's "normal" or "not normal." We really should have "normal temperature readings" from 100 years after the last ice age up until 100 years ago. That would give us a proper sample size. Unfortunately, humans didn't really begin keeping records until the mid-to-late 1800s... soooo we basically have no idea what normal and abnormal is... that is, unless you're Al Gore, and then everything is "above normal."

Right... that's exactly how it works.

:rolleyes:


The Earth is roughly 4.5 billion years old, so 30 years isn't even a drop in the bucket as far as what's "normal" or "not normal." We really should have "normal temperature readings" from 100 years after the last ice age up until 100 years ago. That would give us a proper sample size. Unfortunately, humans didn't really begin keeping records until the mid-to-late 1800s... soooo we basically have no idea what normal and abnormal is... that is, unless you're Al Gore, and then everything is "above normal."

The term "normal," whether it is stated or not, is always in reference to a particular time frame. Thus we can speak of the late-20th-century normal, the 20th-century normal, the 1,000-year normal, the normal for the late Holocene, etc.

When scientists say global temperatures are above normal, they are usually referencing a 20th-century normal or something similar - although we are also near the highest temperatures of the last 2,000 and even 10,000 years.


I really think that the term "normal" is misused. When we talk about the average high/low temperatures for a date, we should say "average," not "normal." "Normal" implies that it should be that particular temperature on that date, when we all know it rarely ever is; the figure is really an average of all the variations in temperature over the last 30 years.


I really think that the term "normal" is misused. When we talk about the average high/low temperatures for a date, we should say "average," not "normal." "Normal" implies that it should be that particular temperature on that date, when we all know it rarely ever is; the figure is really an average of all the variations in temperature over the last 30 years.

"Normal" refers to the normal distribution in probability theory and statistics.


Normals for a 30-year span are updated every 10 years; the next group of normals will be 1981-2010. These will probably be available in the next year or so from the National Climatic Data Center - they are the people charged with computing normals. 30 years is chosen because it falls into a 'statistically significant' range, and updating every 10 years keeps the normals current. That applies to all first-order stations and many cooperative observer stations. When possible, changes in instrumentation are taken into account. The averages are smoothed using a 'spline fit'. More info (than you will ever want to know) can be found here: http://lwf.ncdc.noaa.gov/oa/climate/normals/usnormals.html


I really think that the term "normal" is misused. When we talk about the average high/low temperatures for a date, we should say "average," not "normal." "Normal" implies that it should be that particular temperature on that date, when we all know it rarely ever is; the figure is really an average of all the variations in temperature over the last 30 years.

Using the term "average" is not quite correct. For example, in Washington, DC, the average (mean) high temperature during 1971-2000 was 66.2 degrees Fahrenheit for April 14th and was 64.5 degrees F for April 15th. However, the 1971-2000 "normal" high temperature for both days is 66 degrees F. The reason for this is that the normalization process, which is based on a complex mathematical model, smooths out short-term aberrations, or "noise." The question I'm raising here is whether 30 years is always the most appropriate period to use for a normal.


The term "normal," whether it is stated or not, is always in reference to a particular time frame. Thus we can speak of the late-20th-century normal, the 20th-century normal, the 1,000-year normal, the normal for the late Holocene, etc.

When scientists say global temperatures are above normal, they are usually referencing a 20th-century normal or something similar - although we are also near the highest temperatures of the last 2,000 and even 10,000 years.

Really? How do you know that for sure?


Normals for a 30-year span are updated every 10 years; the next group of normals will be 1981-2010. These will probably be available in the next year or so from the National Climatic Data Center - they are the people charged with computing normals. 30 years is chosen because it falls into a 'statistically significant' range, and updating every 10 years keeps the normals current. That applies to all first-order stations and many cooperative observer stations. When possible, changes in instrumentation are taken into account. The averages are smoothed using a 'spline fit'. More info (than you will ever want to know) can be found here: http://lwf.ncdc.noaa.../usnormals.html

As of the 1981-2010 normals, the spline fit method will no longer be used; a different method of constrained smoothing will take its place. The new method is an application of filtering theory that uses a window of time centered on any given date to derive the "normal." For example, to compute the normal for May 8 using a two-week window, the 450 data points from May 1-15 would be used in the computation. I don't know the exact filtering weights/parameters and window width being used, since they were still being determined and experimental as recently as last fall. The new normals will be officially published and finalized by October, and the extended normals data set will be available in April 2012, at last check.
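The windowed computation described above could be sketched like this (the equal weights, the fake observations, and the station values are all assumptions of mine; as the post notes, NCDC's actual parameters were still experimental):

```python
import datetime
import random

random.seed(1)

# Fake 30 years (1981-2010) of daily highs keyed by (month, day) -- stand-in
# data drawn from a fixed distribution, not real observations.
obs = {}
for year in range(1981, 2011):
    d = datetime.date(year, 1, 1)
    while d.year == year:
        obs.setdefault((d.month, d.day), []).append(60 + random.gauss(0, 10))
        d += datetime.timedelta(days=1)

def windowed_normal(month, day, half_width=7):
    """Unweighted mean over a (2*half_width + 1)-day window across all years."""
    center = datetime.date(2001, month, day)  # any non-leap year works
    vals = []
    for off in range(-half_width, half_width + 1):
        d = center + datetime.timedelta(days=off)
        vals.extend(obs.get((d.month, d.day), []))
    return sum(vals) / len(vals), len(vals)

# May 8 with a 15-day window: 15 days x 30 years = 450 data points, matching
# the count given in the post.
normal_may8, n_points = windowed_normal(5, 8)
assert n_points == 450
```

A real implementation would presumably weight days nearer the center more heavily; the unweighted mean here just shows the windowing mechanics.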


As of the 1981-2010 normals, the spline fit method will no longer be used; a different method of constrained smoothing will take its place. The new method is an application of filtering theory that uses a window of time centered on any given date to derive the "normal." For example, to compute the normal for May 8 using a two-week window, the 450 data points from May 1-15 would be used in the computation. I don't know the exact filtering weights/parameters and window width being used, since they were still being determined and experimental as recently as last fall. The new normals will be officially published and finalized by October, and the extended normals data set will be available in April 2012, at last check.

I have a problem with smoothed numbers... It's fine for daily 'normals' but not for the monthly normals... October's smoothed normal in NYC is half a degree below the average of the 30 years of records...


I have a problem with smoothed numbers... It's fine for daily 'normals' but not for the monthly normals... October's smoothed normal in NYC is half a degree below the average of the 30 years of records...

Monthly averages are not smoothed. The 12 monthly averages for the 30-year normals period are used to generate the 365 daily normals so that the average of the generated daily normal values equals the monthly average. The daily values you used to compute your average may not be the same as those the NCDC used, which may account for the half-degree difference.
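A minimal sketch of that constraint (the monthly values and the interpolate-then-shift mechanics are illustrative assumptions of mine, not NCDC's actual algorithm): generate a smooth daily curve from the 12 monthly means, then adjust each month so its daily values average back exactly to the monthly mean.

```python
import calendar

# Illustrative 30-year monthly mean temperatures (F) -- not real station data.
monthly = [35.1, 37.8, 45.9, 56.2, 66.0, 74.9,
           79.6, 77.9, 70.9, 59.8, 49.9, 39.6]

days_in = [calendar.monthrange(2001, m)[1] for m in range(1, 13)]  # non-leap

# Step 1: naive daily curve via linear interpolation toward the next month.
daily = []
for m in range(12):
    nxt = monthly[(m + 1) % 12]                   # December wraps to January
    for d in range(days_in[m]):
        frac = d / days_in[m]
        daily.append(monthly[m] * (1 - frac) + nxt * frac)

# Step 2: shift each month's days so their mean equals the monthly value.
out, i = [], 0
for m in range(12):
    chunk = daily[i:i + days_in[m]]
    shift = monthly[m] - sum(chunk) / len(chunk)
    out.extend(v + shift for v in chunk)
    i += days_in[m]

# Each month's daily normals now average exactly to the published monthly mean.
for m in range(12):
    start = sum(days_in[:m])
    chunk = out[start:start + days_in[m]]
    assert abs(sum(chunk) / len(chunk) - monthly[m]) < 1e-9
```

The crude shift step introduces small jumps at month boundaries, which is presumably why a proper constrained-smoothing method is used instead; the sketch only shows how the monthly-mean constraint can be enforced.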


Monthly averages are not smoothed. The 12 monthly averages for the 30-year normals period are used to generate the 365 daily normals so that the average of the generated daily normal values equals the monthly average. The daily values you used to compute your average may not be the same as those the NCDC used, which may account for the half-degree difference.

I haven't looked into it much, but it seemed that monthly temp avgs are smoothed... as are monthly snow avgs. Precip is a straight avg, I think.


I haven't looked into it much, but it seemed that monthly temp avgs are smoothed... as are monthly snow avgs. Precip is a straight avg, I think.

It's hard to say, but it doesn't make sense to apply smoothing to the monthly numbers. I was under the impression that it was not done, and assumed any differences we see are because the daily data they use and the data generally available to the public may be slightly different. The biggest difference is the estimation of missing values in the serially complete data set the NCDC uses to compute normals. I suppose the data may also undergo additional QC algorithms and adjustments.


As of the 1981-2010 normals, the spline fit method will no longer be used; a different method of constrained smoothing will take its place. The new method is an application of filtering theory that uses a window of time centered on any given date to derive the "normal." For example, to compute the normal for May 8 using a two-week window, the 450 data points from May 1-15 would be used in the computation. I don't know the exact filtering weights/parameters and window width being used, since they were still being determined and experimental as recently as last fall. The new normals will be officially published and finalized by October, and the extended normals data set will be available in April 2012, at last check.

Good to know - I remember reading that a spline fit wasn't always used to compute normals in some earlier decades. This will be interesting.


Monthly averages are not smoothed. The 12 monthly averages for the 30-year normals period are used to generate the 365 daily normals so that the average of the generated daily normal values equals the monthly average. The daily values you used to compute your average may not be the same as those the NCDC used, which may account for the half-degree difference.

NYC's October average is 56.6 smoothed... The 30 Octobers from 1971-2000 average 57.1... A half-degree difference... Most of the other monthly normals also differ from the 30-year averages...


NYC's October average is 56.6 smoothed... The 30 Octobers from 1971-2000 average 57.1... A half-degree difference... Most of the other monthly normals also differ from the 30-year averages...

Is it possible it's not smoothed and they just used different daily data than you did?


Correct, at least as far as the 1971-2000 Washington, DC precipitation normals go. I presume it's that way for all precipitation normals.

It sort of makes sense if you think about it, since precip doesn't necessarily follow a normal curve the way you'd expect temps to. I assume they smooth monthly snow avgs because snow sort of follows a curve as well, with Jan/Feb being snowiest (here at least). I did some e-mailing with LWX's climate guy but don't fully understand the smoothing process.

