Iceagenow Blog Complains that Record Lows Are Being Ignored


donsutherland1


I believe Dr. Ryan Maue has been keeping a checklist of the silly media headlines about the "greatest heat wave to hit mankind." Kinda funny. This should no doubt free the North Pole of ice by October.

If I'm to believe that it is human generated, then what is the explanation for the incredible heat waves of medieval times, the 1920s, and the 1930s, with far fewer humans around, not to mention a vastly less industrialized world? Why was it so hot? I'm really struggling with that.

If some records have stood since then, and many still do after 80-100 years, doesn't that make you wonder: geez, it must have been hot then too. This is not new to our climate!

So, extreme heat broke how many records? Out of how many? 3,000, 300,000, or 3 million? How many records weren't broken? That's a good question.

And check this out, not government funded: http://www.uni-mainz.de/eng/15491.php

Very interesting to say the least.

Several quick points:

1. The heat wave was not unprecedented. It was extreme.

2. Variability driven by natural factors doesn't disappear in a warming climate. The distribution of temperatures shifts to the right. Moreover, research indicates that variability also increases, meaning that the curve flattens, with wider tails. The probability of high temperatures has therefore increased (an increased mean plus a steady or increased standard deviation). This assumes that temperatures are normally distributed over a climatic period and will remain normally distributed.

To illustrate this concept, let's use a hypothetical city and assume that in Climate State 0 the mean July-August high temperature was 85° and the standard deviation was 5°. A 100° reading would be a 3-sigma event. In Climate State 1, the mean July-August high temperature increased to 87.5° and the standard deviation remained unchanged at 5°. A 100° reading would now be a 2.5-sigma event. In practical terms, multiplying the daily tail probability by the 62 days in July-August, a 100° day would have a statistical probability of occurring about once every 12 years in the earlier climate state and once every 2.6 years in the latter one. This example is for illustrative purposes only.
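To make those return periods concrete, here is a minimal Python sketch of the same arithmetic. The 85°/87.5° means, the 5° standard deviation, and the 62-day July-August season are the hypothetical values from the example above, and the normality assumption is carried over as-is.

from math import erfc, sqrt

SEASON_DAYS = 62      # days in July-August
THRESHOLD = 100.0     # deg F
SD = 5.0              # hypothetical standard deviation, deg F

def tail_prob(z):
    # P(Z > z) for a standard normal variable
    return 0.5 * erfc(z / sqrt(2))

for label, mean in [("Climate State 0", 85.0), ("Climate State 1", 87.5)]:
    z = (THRESHOLD - mean) / SD
    p = tail_prob(z)              # chance of a 100-degree reading on any one day
    expected = SEASON_DAYS * p    # expected 100-degree days per season
    print(f"{label}: z = {z:.1f}, return period ~ {1 / expected:.1f} years")

Running this reproduces the figures above: roughly one such day every 12 years in Climate State 0 and every 2.6 years in Climate State 1.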

3. The global temperature is continuing to increase (30-year moving average and 30-year trend lines). Warming has been most robust in the Arctic region, where that area's temperature trends have begun to diverge from the AMO. As the AMO eventually goes negative, it will likely damp Arctic warming, but the divergent relationship hints that the Arctic will generally remain warmer than normal (and probably much warmer than it was during the last negative AMO cycle). The deepest and longest solar minimum since at least early in the 20th century did not eliminate the earth's energy imbalance that has been driving the ongoing warming.

4. Despite some speculation that 2012 is running cooler than the 1981-2010 climatic baseline, both the NCDC and GISS datasets reveal that the January-May period has been warmer than the 1981-2010 baseline. The emergence of an El Niño will very likely increase those warm anomalies.


From the uni-mainz article linked above:

"We found that previous estimates of historical temperatures during the Roman era and the Middle Ages were too low," says Esper. "Such findings are also significant with regard to climate policy, as they will influence the way today's climate changes are seen in context of historical warm periods."

Why is that? What does a more precise knowledge of past climate change have to do with what we expect to occur over the coming decades and centuries?

We are near certain that the decline in global temperature over the past millennium was due to changes in orbital configuration. Likewise, we are near certain that the ongoing warming is a consequence of human alteration of the global greenhouse effect.

Snowlover123?


Good post, Don.

Do you know if there are any places where I can find the standard deviations for certain weather sites for the summer months or will I just have to go through and figure them out myself based on temp records?


Msalgado,

Unfortunately, one has to calculate that information. It is rare for data to be provided in terms of standard deviations. A recent exception concerned the January-June temperature anomalies:

http://www.ncdc.noaa...emental/page-3/ (click on "Year-to-Date Anomalies")

I'd like to see a lot more of the information provided in a standardized format.
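For anyone who does want to run the numbers, here is a minimal sketch of the calculation. The file name and column names are placeholders for whatever daily records you have, not a real NCDC product.

import pandas as pd

# Hypothetical daily-record file with "date" and "tmax" columns.
df = pd.read_csv("station_daily.csv", parse_dates=["date"])

# Restrict to July-August daily highs.
summer = df[df["date"].dt.month.isin([7, 8])]

mean_high = summer["tmax"].mean()
sigma = summer["tmax"].std()
print(f"July-August mean high: {mean_high:.1f}, sigma: {sigma:.1f}")

# Express any given reading in sigma units:
print("z-score of a 100-degree day:", (100 - mean_high) / sigma)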


Don't know if it's close to what you're after, but the NOAA year-to-date anomaly page lists standard deviations in the "Unusualness" column. For example, Austin is 2.0 sigma over the norm through June. In contrast, Nome is 1.6 sigma below normal for the same period. (Be it ever so humble, there's no place like Nome.)


I have a feeling I'm just going to have to run the averages myself from historical data to get the sigma values. I've been working on an undergraduate research project over the past year on future heat wave vulnerability. I've got GIS model data, so I have average monthly temperature forecasts; with current sigma values, I can project how many days above 90 or 100 the sites in question would experience under the modeled temperature forecasts. So, if the distribution values for these stations were already available, I'd save myself some work.
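If it helps, that projection step can be sketched in a few lines. The mean and sigma below are invented placeholders; the approach simply shifts the mean to the modeled value while holding the station's current sigma fixed, under the same normality assumption discussed above.

from math import erfc, sqrt

def days_above(threshold, mean, sigma, n_days=62):
    # Expected number of days above a threshold in an n-day season,
    # assuming daily highs are normally distributed.
    p = 0.5 * erfc((threshold - mean) / (sigma * sqrt(2)))
    return n_days * p

# Invented example: modeled future mean high of 92 with a current sigma of 5.
for threshold in (90, 100):
    print(threshold, f"{days_above(threshold, mean=92.0, sigma=5.0):.1f} expected days")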


I don't subscribe to Weatherbull, but if you go to the premium site, you can see the first few sentences of JB's posts without having to pay - a teaser, if you will. In one of them he rhetorically asks if North America is in for something Russia went through - a very hot summer followed by a frigid winter. I believe he's referencing Moscow's extremely hot summer of 2010 and the bitter winter of 2011-2012. According to his tweets, he seems to think it could be a cold run-up to the November election. Sounds like he's going for a 1976-77 and 2002-2003 redux.


73.2 to 76.9 is a huge change. Whenever cooling trends initiate, those cool temperature departures are easily overcome and moderated in the direction of a warming climate trend. On the other hand, heat waves and warm trends seem to have robust persistence.

If I were a buyer for a department store in the Northeast US, I wouldn't want to overbuy heavy winter clothing, lest the unsold inventory be sold off at the end of the season at a loss.

I hear ya, man. I remember when I was a kid and it was 0.4 degrees colder in lower Michigan; I used to bundle up... Now, with it being 0.4 degrees warmer, I just go outside bare-chested on 15°F days in January. Big difference between my morning lows of -5°F and what used to be -5.4°F.

Joking aside, NYC is a heat island of the worst kind. The world is slightly warmer today than it was in 1950, but not 3 degrees, at least not in the middle latitudes. You are comparing 1870 to the 2010s (a 2.5-year period); use the 2000s and it's a 1.1-degree increase, more in line with world averages.


76-77? Yeah, I would love to see us get like that. That is crazy cold for this climate today. St. Louis averaged 23-24°F that winter. I would imagine a lot of Great Lakes cities averaged in the teens.

When folks ask if AGW is real, ask yourself whether you really believe that would be possible right now.

St. Louis going 30/16/24 for a winter would be crazy compared to recent winters. It would be a 15F swing from last winter.

Standard deviations for 76/77 were 2-3 over a large portion of the eastern US, with some areas above 3, which is likely close to a record.

[Image: winter 1976-77 temperature departures, in standard deviations]

Now 02-03. This is much more possible, but still very cold for the Northeast and the East generally, though nowhere near as absurd as a 1970s-style winter that cold.

[Image: winter 2002-03 temperature departures, in standard deviations]


I remember, in some of his tweets earlier this year, JB opined that we would see winters like the late 70s within the next few years. I suppose it's not completely out of the question - look at how cold eastern Europe and Russia were last winter.

It's not; its frequency would just be far more rare. That area also has a much better source of high-albedo snowfields coming directly from the Arctic.

I can't see the widespread -10 to -20°F temperatures we saw before reaching this far south, but I'd welcome it.


Sorry for the very late response to this thread. Don Sutherland, thanks for replying to my critique of your statistics over on CWG. My main point was that you cannot simply multiply the (low) probabilities of individual 100-degree days to obtain a combined (very low) probability of four or five 100-degree days in a row. Your otherwise valid statistics rely on a normal distribution of temperatures, but given a prior day of 100 degrees, the next day's temperature does not follow a normal distribution with the same parameters (it is probably skewed, and certainly has a different mean and standard deviation). Thus it is the higher conditional probability of a subsequent 100-degree day that should be multiplied, not the overall 100-degree-day probability.

A couple of notes on UHI. It is not "debunked," but it can be adjusted for in computing global average trends. What UHI usually produces is higher minimums, due to retained heat and less mixing, not higher maximums. John N-G commented on WUWT that the Dallas Love Field readings were both warmer at night and cooler in the day than rural areas (he implied it was due to suburban irrigation). My own take is that the light-colored concrete at that airport is a great heat sink, not necessarily a solar absorber. Sensor errors consist of both such very local effects and the UHI of the broader area, and the very local effects can cool or warm.
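The persistence point can be illustrated with a quick simulation. Everything below is invented for illustration (an 85°/5° climate with an AR(1) day-to-day persistence of 0.6 and a 95° threshold); the point is only the gap between the independence estimate and the simulated frequency of four-day runs.

import numpy as np

rng = np.random.default_rng(42)
MEAN, SD, PHI = 85.0, 5.0, 0.6   # invented climate; PHI = day-to-day persistence
N = 1_000_000                    # simulated days

# AR(1) daily highs: today's anomaly retains a fraction of yesterday's.
anom = np.zeros(N)
shocks = rng.normal(0.0, SD * np.sqrt(1 - PHI**2), N)
for t in range(1, N):
    anom[t] = PHI * anom[t - 1] + shocks[t]
hot = (MEAN + anom) >= 95.0

p_single = hot.mean()
runs4 = (hot[:-3] & hot[1:-2] & hot[2:-1] & hot[3:]).mean()  # four hot days in a row
print(f"naive independence estimate: {p_single**4:.1e}")
print(f"simulated with persistence:  {runs4:.1e}")

The simulated frequency comes out orders of magnitude higher than the naive product of single-day probabilities, which is exactly the error described above.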


Welcome to AmericanWx, Eric.

I agree that you raised a fair point concerning consecutive 100° days. Conditional probability might make more sense for such stretches. Using sigma values to measure daily highs (or lows) is distinct from the matter of consecutive days.

Also, I didn't state that UHI is "debunked." It's a real phenomenon, but statistical adjustments can be applied. Hence, the global temperature record is not significantly impacted by UHI. The Berkeley Earth Surface Temperature (BEST) Project examined the global temperature record. Its paper on UHI can be found at: http://berkeleyearth.org/pdf/berkeley-earth-uhi.pdf


Actually there was a recent paper published on this subject:

http://itia.ntua.gr/en/docinfo/1212/

We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. Based on a systematic study of scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments and are rarely supported by metadata. In many of the cases studied the proposed corrections are not even statistically significant.

From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States of America, because of the large number of available stations, stations were chosen after a suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in the two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends.

One of the most common homogenization methods, ‘SNHT for single shifts’, was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent data normally distributed, but not in data with long-term persistence.

The above results cast some doubts in the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.

Conclusions

1. Homogenization is necessary to remove errors introduced in climatic time series.

2. Homogenization practices used until today are mainly statistical, not well justified by experiments and are rarely supported by metadata. It can be argued that they often lead to false results: natural features of hydroclimatic time series are regarded errors and are adjusted.

3. While homogenization is expected to increase or decrease the existing multiyear trends in equal proportions, the fact is that in 2/3 of the cases the trends increased after homogenization.

4. The above results cast some doubts in the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is smaller than 0.7-0.8°C.

5. A new approach of the homogenization procedure is needed, based on experiments, metadata and better comprehension of the stochastic characteristics of hydroclimatic time series.
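For readers unfamiliar with what a shift-type adjustment actually does, here is a minimal sketch on synthetic data (the station move, its date, and its 0.5° offset are all invented). It uses a simple difference-from-neighbor comparison to locate and remove a single break; it illustrates the idea, and is not the SNHT itself.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1913, 2013)
trend = 0.006 * (years - years[0])            # shared regional trend, deg C
target = trend + rng.normal(0, 0.3, 100)
neighbor = trend + rng.normal(0, 0.3, 100)
target[60:] += 0.5                            # artificial station move in 1973

diff = target - neighbor                      # the trend cancels; the shift remains
# Find the single breakpoint that maximizes the separation of the two means.
scores = [abs(diff[:k].mean() - diff[k:].mean()) for k in range(10, 90)]
k = int(np.argmax(scores)) + 10
offset = diff[k:].mean() - diff[:k].mean()

adjusted = target.copy()
adjusted[k:] -= offset                        # homogenized series
print(f"break detected at {years[k]}, estimated offset {offset:+.2f} C")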


If I were a buyer for a department store in the Northeast US, I wouldn't want to overbuy heavy winter clothing, lest the unsold inventory be sold off at the end of the season at a loss.

I disagree:

http://www.leif.org/EOS/2011GL050720.pdf

Temperatures and precipitation in the Northeast US are governed by the impact of solar activity changes on the NAO/AO.


Snowlover123,

Thanks for sharing this presentation.

Four quick points:

1) The presentation validates the necessity of homogenization or adjustment to remove errors from the climate record (time series of data).

2) Data adjustment methodology is not perfect. It is reasonable. I strongly support efforts to further improve the methodology. A more accurate record is in the interests of all who study and/or report on the climate and its evolution.

3) The authors of the presentation you provided suggested that warming over the past century was in the +0.4°C to +0.7°C range (vs. the +0.7°C to +0.8°C suggested by the temperature data), so there's good agreement that appreciable warming has occurred.

4) The study in question was based on 181 sites. The larger BEST study examined 2000 randomly selected sites. Would the authors' findings have been similar had the authors examined a larger sample?


That's a conference proceeding from EGU, not a peer-reviewed paper, so it should be salted to taste.

I agree, Wxtrix. Having said that, even this presentation contradicts the opinion of Mr. Goddard that was advanced in some of the discussions:

1. There was appreciable warming (not a decline in temperatures that was hidden)

2. Homogenization/adjustment are necessary (and don't amount to "manipulation")


Roy Spencer did an in-depth look at the USHCN data as well. His findings were closer to Mr. Goddard's.

LINK

Since NOAA encourages the use of the USHCN station network as the official U.S. climate record, I have analyzed the average [(Tmax+Tmin)/2] USHCN version 2 dataset in the same way I analyzed the CRUTem3 and International Surface Hourly (ISH) data.

The main conclusions are:

1) The linear warming trend during 1973-2012 is greatest in USHCN (+0.245 C/decade), followed by CRUTem3 (+0.198 C/decade), then my ISH population density adjusted temperatures (PDAT) as a distant third (+0.013 C/decade)

2) Virtually all of the USHCN warming since 1973 appears to be the result of adjustments NOAA has made to the data, mainly in the 1995-97 timeframe.

3) While there seems to be some residual Urban Heat Island (UHI) effect in the U.S. Midwest, and even some spurious cooling with population density in the Southwest, for all of the 1,200 USHCN stations together there is little correlation between station temperature trends and population density.

4) Despite homogeneity adjustments in the USHCN record to increase agreement between neighboring stations, USHCN trends are actually noisier than what I get using 4x per day ISH temperatures and a simple UHI correction.

Given the amount of work NOAA has put into the USHCN dataset to increase the agreement between neighboring stations, I don’t have an explanation for this result. I have to wonder whether their adjustment procedures added more spurious effects than they removed, at least as far as their impact on temperature trends goes.

And I must admit that those adjustments constituting virtually all of the warming signal in the last 40 years is disconcerting. When “global warming” only shows up after the data are adjusted, one can understand why so many people are suspicious of the adjustments.
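As an aside on how such °C/decade figures are produced, here is a minimal sketch of fitting a linear trend to monthly anomalies; the synthetic series below merely stands in for USHCN, CRUTem3, or ISH data.

import numpy as np

rng = np.random.default_rng(1)
months = 1973 + np.arange(480) / 12.0                       # 1973-2012, monthly
anoms = 0.02 * (months - 1973) + rng.normal(0, 0.15, 480)   # 0.2 C/decade + noise

slope_per_year = np.polyfit(months, anoms, 1)[0]            # least-squares fit
print(f"linear trend: {slope_per_year * 10:+.3f} C/decade")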


Just for the sake of argument, if the temperature trend over the past century indicating 0.74°C of warming is in error on the high side by a few tenths, what difference does it make to the larger picture? Spell it out!

To my way of thinking, this changes nothing. The physics is not altered. The studies indicating a most likely equilibrium response to a doubling of CO2 being 2.7C are not invalidated.

We believe the global temperature difference between interglacials and glacial periods is a minimum of 5C. If it is actually 3C, all that means is that the climate is considerably more sensitive to a temperature perturbation than we think.


It makes little difference. The warming is real. It is ongoing. With global temperatures diverging from the solar forcing and oceanic cycles (PDO-AMO), the growing relative influence of rising atmospheric CO2 is also evident.

Also, given its larger sample size, I have more confidence in the BEST Project's findings.


Neither the small study conducted by Steirou et al., nor the comprehensive BEST Project concluded that data homogenization/adjustments account for virtually all of the observed warming. Even if the small study precisely and accurately measured warming at a lower 0.4°C to 0.7°C range, data adjustment would account for less than half of observed warming (and perhaps less than 10%). The small study is probably an outlier on one end. The BEST Project found that the major datasets are reasonable. The notion that the climate is warming only via adjustment is incorrect. Growing seasons, first/last freezes, plant hardiness zones, reduced summer sea ice and ice volume, glacial melt, etc., all indicate a warming climate. The warming is not some artifact of homogenization.


This latest argument put forth by skeptics, purporting to show a reduction in the amount of actual global warming over the past century, is an attempt to eliminate the need to explain warming due to anthropogenic greenhouse gases. After all, if there is no problem, then why do we need an explanation? If there is little to worry about, then why subject the economy to (supposedly) damaging mitigation strategies?

Another tactic has been to trash the warming effect of CO2: after all, you realize it constitutes such a tiny fraction of the atmosphere, don't you? Oh yeah, water vapor is the most important greenhouse gas, so we don't need to worry about CO2. How ridiculous. Etc., etc.

What the skeptics hope you do not realize, or what they do not recognize themselves, is that the prediction of serious global warming and consequent climate change was not born of a need to explain any observed warming; rather, it follows from standard physics.


So, to summarize your summary ( :) ), what Spencer is saying is that his manipulation of the data is better than other people's manipulation of the data.

Doesn't that go against your belief that all manipulated data is untrustworthy? If you allow Spencer to manipulate the data, then you must allow for other methods of data manipulation.


It's not really about Spencer's ISH PDAT data; it was more about his findings on the USHCN adjustments. Do I trust Spencer's manipulation over others'? Not entirely. However, I would say his figures are closer to being accurate than USHCN's. What GISS did in the Arctic should send up red flags to anyone who thinks those adjustments are being made without bias.


You are quite welcome.

1) Agreed.

2) Agreed that the methodology is reasonable but not perfect; however, this study suggests more than mere imperfection: it suggests a potentially large problem with the data adjustment methodology. A difference of up to 0.4 degrees is extremely significant, since that could be half of the "warming" observed in the datasets.

3) Actually, this difference is VERY significant, since it bears on how sensitive the climate is to an increase in carbon dioxide. If we assume that the planet has warmed by only 0.4°C rather than 0.8°C, and assume that all of that warming was due to anthropogenic greenhouse gases (which is wrong, but let us continue with the assumption for hypothetical purposes), we get a sensitivity of only about 1°C, which by no one's standards would represent a problem for the future or be considered alarming. Assuming that CO2 caused all of a 0.8°C warming, you would get sensitivities of around 2°C, which is a pretty large difference, climatologically speaking.

Of course, I maintain that there is much evidence for a dominant role for the sun in causing most of the 20th-century global warming, which reduces the sensitivity to increased CO2 even further.

If we assume that solar activity caused 70% of the warming, as Scafetta and West do, then you get extremely low sensitivities, especially if you take the warming over the last 100 years to be only 0.4°C.

I do not have an answer for your last bullet point, since I did not participate in this study.
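For what it's worth, the arithmetic behind those figures can be reproduced roughly as follows. The 290-to-390 ppm CO2 rise and the 3.7 W/m² per doubling are standard approximate values, and the calculation deliberately mirrors the post's simplification: it ignores non-CO2 forcings and unrealized ocean warming, so it is a debating aid, not a sensitivity estimate.

import math

F_2X = 3.7                          # W/m^2 forcing per CO2 doubling (standard value)
f_co2 = 5.35 * math.log(390 / 290)  # approximate CO2 forcing over the past century

for warming in (0.4, 0.8):          # degrees C, the two disputed totals
    for co2_share in (1.0, 0.3):    # all-CO2 vs. the 70%-solar attribution
        sens = warming * co2_share / f_co2 * F_2X
        print(f"warming={warming} C, CO2 share={co2_share:.0%}: "
              f"implied sensitivity ~{sens:.1f} C per doubling")

With all of a 0.8°C warming attributed to CO2 this gives roughly 1.9°C per doubling, and with 0.4°C it gives roughly 0.9°C, matching the figures above; the 30% CO2 share drops both well below 1°C.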


See my analysis above.

In order to believe that climate sensitivity is around 2.7°C, you would have to believe that the planet has warmed by 1.1°C (a warming trend which no study or dataset has measured).

Aerosols are currently at a 55-year low, which does not bode well for the hypothesis that aerosols are magically cancelling out enough warming to support a 2.7°C sensitivity.
