More emails released


LakeEffectKing


The map he posted was just a 1 day anomaly.

But in general the Antarctic has shown less warming than the Arctic over the last 50 years. There is some evidence the Antarctic went through a period of warming prior to 50 years ago.

The decline of stratospheric ozone (think of the huge south pole ozone hole) has also led to a strengthening of the southern stratospheric vortex, which leads to a more insular and cooler south pole climate.

Considering that 90% of the planet's ice is in the southern hemisphere, it seems pretty logical that even a slight change would show decline quickly. I really wish we could show ice extent 200+ years ago to see what it was like compared to today, but you can't measure ice that already melted. I understand the passion scientists have for this, but we are talking such a small sample of time.



Considering that 90% of the planet's ice is in the southern hemisphere, it seems pretty logical that even a slight change would show decline quickly. I really wish we could show ice extent 200+ years ago to see what it was like compared to today, but you can't measure ice that already melted. I understand the passion scientists have for this, but we are talking such a small sample of time.

There is some evidence from ship reports during the first half of the 20th century that Antarctic sea ice was greater then, which would make sense given the probability that Antarctica warmed significantly mid-century when the AAO changed states.


So Gore and Hansen are now global financial experts? The schemes they've laid out to try and hedge the so-called hypothesized consequences of an unproven hypothesis (CAGW) are KNOWN to them to have lesser consequences for humanity??? Symbolism over substance. All the significant consequences they assign some significant probability to are postulates....that's it. The betterment of humanity includes doing a full assessment of BOTH the consequences of drastic action vs. inaction.....they are climate scientists, not financial experts.

If you don't already understand, you probably never will, and I certainly cannot explain it to you.

FYI. Money is not real.

You can't eat and drink green pieces of paper.


Expert Osborn thought the setting of the tree ring proxies to "missing" (post-1960) and then "infilling" might not be "defensible!" (last sentence)

Yet, the questioning of this method by many skeptics before and after CG1.0 was met with strong defense by AGW'ers. Seems like skeptics were correct in being skeptical:

date: Mon, 16 Oct 2000 22:54:31 +0100
from: Tim Osborn <[email protected]>
subject: progress
to: [email protected], [email protected]

Hi Keith & Phil (a long one this, as I have an hour to kill!)

We’re making slow-ish progress here but it’s still definitely v. useful. I’ve brought them up-to-date with our work and given them reprints. Mike and Scott Rutherford have let me know what they’re doing, and I’ve got a preprint by Tapio Schneider describing the new method and there’s a partially completed draft paper where they test it using the GFDL long control run (and also the perturbed run, to test for the effect of trend and non-stationarities). The results seem impressive – and ‘cos they’re using model data with lots of values set to missing, they can do full verification. The explained verification variances are very high even when they set 95% of all grid-box values to missing (leaving about 50 values with data over the globe I think). In fact the new method (regularized expectation maximization, if that means anything to you, which is similar to ridge regression) infills all missing values (not just in the climate data), which is interesting for infilling climate data from climate data, proxy data from climate data (see below).

As well as the GFDL data, they’ve also applied the method to the Jones et al. data on its own. The method fills in missing temperatures using the non-missing temperatures (i.e., similar to what Kaplan et al. do, or the Hadley Centre do for GISST, but apparently better!). So they have a complete data set from 1856-1998 (except that any boxes that had less than about 90 years of data remain missing, which seems fair enough since they would be going too far if they infilled everything).

We’re now using the MXD data set with their program and the Jones et al. data to see: (i) if the missing data from 1856-1960 in the Jones et al. data set can be filled in better using the MXD plus the non-missing temperatures compared to what can be achieved using just the non-missing temperatures. I expect that the MXD must add useful information (esp. pre-1900), but I’m not sure how to verify it! The program provides diagnostics estimating the accuracy of infilled values, but it’s always nice to test with independent data. So we’re doing a separate run with all pre-1900 temperatures set to missing and relying on MXD to infill it on its own – can then verify, but need to watch out for the possibly artificial summer warmth early on. We will then use the MXD to estimate temperatures back to 1600 (not sure that their method will work before 1600 due to too few data, which prevents the iterative method from converging), and I will then compare with our simpler maps of summer temperature. Mike wants winter (Oct-Mar) and annual reconstructions to be tried too. Also, we set all post-1960 values to missing in the MXD data set (due to decline), and the method will infill these, estimating them from the real temperatures – another way of “correcting” for the decline, though may be not defensible!
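For readers unfamiliar with the technique the email describes (regularized expectation maximization, which Osborn likens to ridge regression), here is a minimal, illustrative Python sketch of iterative, ridge-regularized infilling. It is not the actual RegEM code; the synthetic data, the fixed ridge parameter, and the column-by-column regression loop are assumptions made purely to convey the flavor of estimating missing values from the non-missing ones and then verifying against withheld truth.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "climate" data: 200 time steps x 8 correlated series (all invented).
n_time, n_series = 200, 8
common = rng.normal(size=(n_time, 1))                      # shared signal
truth = common + 0.5 * rng.normal(size=(n_time, n_series))

# Knock out 40% of the values at random to mimic missing grid-box data.
missing = rng.random(truth.shape) < 0.40
data = truth.copy()
data[missing] = np.nan

def ridge_infill(x, miss, ridge=0.1, n_iter=50, tol=1e-6):
    """Iteratively infill missing values.

    Each pass ridge-regresses every gappy column on the other columns
    (fit on rows where that column is observed) and replaces its missing
    entries with the prediction. This mimics the spirit of regularized
    EM; it is not the exact RegEM formulation."""
    filled = np.where(miss, np.nanmean(x, axis=0), x)      # start from column means
    for _ in range(n_iter):
        previous = filled.copy()
        for j in range(filled.shape[1]):
            rows = miss[:, j]
            if not rows.any():
                continue
            predictors = np.delete(filled, j, axis=1)
            a, b = predictors[~rows], filled[~rows, j]
            coef = np.linalg.solve(a.T @ a + ridge * np.eye(a.shape[1]), a.T @ b)
            filled[rows, j] = predictors[rows] @ coef
        if np.max(np.abs(filled - previous)) < tol:
            break
    return filled

infilled = ridge_infill(data, missing)

# "Full verification": compare infilled values against the withheld truth.
err = infilled[missing] - truth[missing]
explained = 1.0 - np.var(err) / np.var(truth[missing])
print(f"explained verification variance: {explained:.2f}")
```

Because the withheld values are known here, the verification step mirrors the "set values to missing, then verify" exercise the email describes for the GFDL run.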



Reads like a conversation between Bill Clinton skeptic Linda Tripp and Ken Starr

Don't you ever get tired of this?


Reads like a conversation between Bill Clinton skeptic Linda Tripp and Ken Starr

Don't you ever get tired of this?

Not when it validates my skepticism. Many of the emails absolutely reek of confirmation bias.

If you want to defend the implications of such an exchange between two "scientists", please continue to do so....It helps our "cause".


The "decline" problem, also known as the divergence problem has been well documented in many studies. This is not trickery. One convenient paper which summarizes the problem is located here. http://www.wsl.ch/info/mitarbeitende//cherubin/download/D_ArrigoetalGlobPlanCh2008.pdf

It boils down to the fact that from the 1960s or so onward, the tree ring proxies haven't been behaving as they used to. They diverge from instrumental measurements over the recent past. The e-mail you posted just shows a couple scientists trying to figure out ways around this problem... this is how science is done, you bounce around ideas until you find something that works.
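To make "divergence" concrete, here is a small, hypothetical Python sketch: an invented instrumental series and an invented tree-ring proxy that tracks it until 1960 and then stops responding, with the correlation computed separately for the two periods. All numbers are made up; the sketch only illustrates the kind of behavior the D'Arrigo et al. paper documents.

```python
import numpy as np

rng = np.random.default_rng(1)

years = np.arange(1900, 2001)
# Invented instrumental temperatures: slow warming plus noise.
instrumental = 0.01 * (years - 1900) + 0.1 * rng.normal(size=years.size)

# Invented tree-ring proxy: follows temperature until 1960, then stops responding.
proxy = instrumental + 0.05 * rng.normal(size=years.size)
proxy[years > 1960] = proxy[years == 1960] + 0.05 * rng.normal(size=(years > 1960).sum())

for label, period in [("pre-1960", years <= 1960), ("post-1960", years > 1960)]:
    r = np.corrcoef(instrumental[period], proxy[period])[0, 1]
    print(f"{label}: correlation of proxy vs. instrumental = {r:.2f}")
```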


The "decline" problem, also known as the divergence problem has been well documented in many studies. This is not trickery. One convenient paper which summarizes the problem is located here. http://www.wsl.ch/in...bPlanCh2008.pdf

It boils down to the fact that from the 1960s or so onward, the tree ring proxies haven't been behaving as they used to. They diverge from instrumental measurements over the recent past. The e-mail you posted just shows a couple scientists trying to figure out ways around this problem... this is how science is done, you bounce around ideas until you find something that works.

No it's not.....You present the data as is, then footnote it if there is a perceived problem......unless there is some agenda to fulfill....you cannot read the emails and NOT see there was a concerted effort to create a more "dire" presentation.....sorry!


No it's not.....You present the data as is, then footnote it if there is a perceived problem......unless there is some agenda to fulfill....you cannot read the emails and NOT see there was a concerted effort to create a more "dire" presentation.....sorry!

Here's the thing. The raw data are floating around and you can get these data and see all the missing values. Studies rarely present data in their raw form... the point of a study is to do something with the data. But, when studies require that statistical models be used to backcast and forecast global temperatures, the gaps must be filled in. Paleoclimatic reconstructions are really just fancy backcasting models. Common practice involves filling in these missing data using some sort of interpolation technique, which is what they are clearly discussing in this email. What is the best interpolation technique to use for this particular set of data? That is a loaded question and requires a lot of thinking and pondering to do it correctly.

Also, all interpolations are clearly spelled out in any paper that is published. This doesn't happen behind the scenes.
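As a toy illustration of why the choice of interpolation technique matters, the hypothetical sketch below fills a block of withheld values in a made-up series two different ways and compares the error of each. Both methods and all numbers are invented for illustration; real infilling schemes are far more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(2)

t = np.arange(50)
series = np.sin(t / 8.0) + 0.1 * rng.normal(size=t.size)   # invented data
gap = (t >= 20) & (t < 30)                                  # withhold a block of values

observed = series.copy()
observed[gap] = np.nan

# Option 1: fill the gap with the mean of the observed values.
mean_fill = np.where(gap, np.nanmean(observed), observed)

# Option 2: linear interpolation across the gap.
linear_fill = observed.copy()
linear_fill[gap] = np.interp(t[gap], t[~gap], observed[~gap])

for name, filled in [("mean fill", mean_fill), ("linear interpolation", linear_fill)]:
    rmse = np.sqrt(np.mean((filled[gap] - series[gap]) ** 2))
    print(f"{name}: RMSE over the gap = {rmse:.3f}")
```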


Here's the thing. The raw data are floating around and you can get these data and see all the missing values. Studies rarely present data in their raw form... the point of a study is to do something with the data. But, when studies require that statistical models be used to backcast and forecast global temperatures, the gaps must be filled in. Paleoclimatic reconstructions are really just fancy backcasting models. Common practice involves filling in these missing data using some sort of interpolation technique, which is what they are clearly discussing in this email. What is the best interpolation technique to use for this particular set of data? That is a loaded question and requires a lot of thinking and pondering to do it correctly.

Also, all interpolations are clearly spelled out in any paper that is published. This doesn't happen behind the scenes.

Not if it's done to only PART of the data. The raw data you speak of was "good enough" to be presented pre-1960. If one uses a certain proxy to ascertain a historical record (and presents it in the visual graph) but then tries to "hide the decline" because ONE part of the proxy has a perceived "problem", then it should definitely be pointed out as such at that time, on the graph. This part of the presented work was certainly an attempt to skew the perception. The phrase "hide the decline" wouldn't have been used behind the scenes if that wasn't the case. An alternate, more appropriate statement of an objective observer would have been, "correct the problem with the proxy".

Looks bad, smells bad, and is bad.


Not if it's done to only PART of the data. The raw data you speak of was "good enough" to be presented pre-1960. If one uses a certain proxy to ascertain a historical record (and presents it in the visual graph) but then tries to "hide the decline" because ONE part of the proxy has a perceived "problem", then it should definitely be pointed out as such at that time, on the graph. This part of the presented work was certainly an attempt to skew the perception. The phrase "hide the decline" wouldn't have been used behind the scenes if that wasn't the case. An alternate, more appropriate statement of an objective observer would have been, "correct the problem with the proxy".

Looks bad, smells bad, and is bad.

Your claims are not built on any evidence whatsoever. Show me a paper where the tree ring data were used and the "decline" was hidden but not explained. Also, read the paper that I linked to above. It will help you understand why your quote "An alternate, more appropriate statement of an objective observer would have been, "correct the problem with the proxy"." doesn't make sense.

The idea is that the data are not simply being plotted on a graph. There are lots of other things that the data are used for, so it's not as simple as just putting a mark where the data were interpolated. This is why the data and methodology section of a paper is so important to both read and understand before the results can be correctly interpreted.


If you don't already understand, you probably never will, and I certainly cannot explain it to you.

FYI. Money is not real.

You can't eat and drink green pieces of paper.

Wow, Friv. Time for you to lead by example and stop paying your imaginary bills...

Hansen, Gore, and Mann will continue to torpedo efforts by more respected scientists on your side of the issue. You should not be defending them...


Your claims are not built on any evidence whatsoever. Show me a paper where the tree ring data were used and the "decline" was hidden but not explained. Also, read the paper that I linked to above. It will help you understand why your quote "An alternate, more appropriate statement of an objective observer would have been, "correct the problem with the proxy"." doesn't make sense.

The idea is that the data are not simply being plotted on a graph. There are lots of other things that the data are used for, so it's not as simple as just putting a mark where the data were interpolated. This is why the data and methodology section of a paper is so important to both read and understand before the results can be correctly interpreted.

The paper you cite as defense of such culling of a proxy actually provides a decent argument to skeptics that tree ring proxies for temperature are highly uncertain, and are quite sensitive to stresses related to non-temperature influences (dimming, drought, disease, etc.). My overall point is that proxies (in graphical form) that have a large uncertainty to begin with should be presented as such, IMO, and not spliced together. In effect, the scientists involved seem to be playing both sides of the coin.....very certain of the historical significance of tree ring data as a temperature proxy, but uncertain enough to ponder the various stresses that COULD be reasons for the divergence in the very short term (last 45 or so years....)


The paper you cite as defense of such culling of a proxy actually provides a decent argument to skeptics that tree ring proxies for temperature are highly uncertain, and are quite sensitive to stresses related to non-temperature influences (dimming, drought, disease, etc.). My overall point is that proxies (in graphical form) that have a large uncertainty to begin with should be presented as such, IMO, and not spliced together. In effect, the scientists involved seem to be playing both sides of the coin.....very certain of the historical significance of tree ring data as a temperature proxy, but uncertain enough to ponder the various stresses that COULD be reasons for the divergence in the very short term (last 45 or so years....)

Of course they should, and they are. All data are presented with appropriate confidence intervals and significance levels. For the kinds of things that proxies are used for, they have to be spliced together. For example, if one wants an average global temperature reconstruction, one must put together proxies from all over the planet. Each proxy has different time and spatial resolutions and these need to be accounted for. If the proxies were not spliced together we wouldn't be able to get any useful information out of them.
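A very rough sketch of what combining proxies of different resolutions onto a common grid can look like, in the simplest possible terms: two invented proxy series, one annual and one decadal, are put on an annual grid and averaged, with the spread between them kept as a crude uncertainty. Real reconstructions calibrate and weight proxies far more carefully; everything here is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

years = np.arange(1600, 2001)
signal = 0.002 * (years - 1600)                     # invented common "climate" signal

# Proxy A: annual resolution, noisy.
proxy_a = signal + 0.15 * rng.normal(size=years.size)

# Proxy B: decadal resolution, less noisy; put it on the annual grid by interpolation.
decades = np.arange(1600, 2001, 10)
proxy_b_decadal = 0.002 * (decades - 1600) + 0.05 * rng.normal(size=decades.size)
proxy_b = np.interp(years, decades, proxy_b_decadal)

# Combine: simple unweighted mean, with the spread between proxies as a rough uncertainty.
stack = np.vstack([proxy_a, proxy_b])
reconstruction = stack.mean(axis=0)
spread = stack.std(axis=0)

print("reconstructed anomaly in 2000: "
      f"{reconstruction[-1]:.2f} +/- {spread[-1]:.2f} (arbitrary units)")
```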


Not if it's done to only PART of the data. The raw data you speak of was "good enough" to be presented pre-1960. If one uses a certain proxy to ascertain a historical record (and presents it in the visual graph) but then tries to "hide the decline" because ONE part of the proxy has a perceived "problem", then it should definitely be pointed out as such at that time, on the graph. This part of the presented work was certainly an attempt to skew the perception. The phrase "hide the decline" wouldn't have been used behind the scenes if that wasn't the case. An alternate, more appropriate statement of an objective observer would have been, "correct the problem with the proxy".

Looks bad, smells bad, and is bad.

You are all but saying the scientists are conspiring to delude the public about climate change ( as does the entire e-mail controversy ). Why don't you just come out and say it? You're a conspiracy theorist which you find apropos to your narrow minded view of how science should be conducted. You admit that you feel the scientists have a non-scientific agenda. You don't know of the methodologies and techniques these scientists use to do their research, but through the filter of your own limited knowledge and confirmation bias you see a conspiracy to defraud the public. A little ignorant knowledge can be a dangerous thing in the hands of ideologues such as yourself.


The paper you cite as defense of such culling of a proxy actually provides a decent argument to skeptics that tree ring proxies for temperature are highly uncertain, and are quite sensitive to stresses related to non-temperature influences (dimming, drought, disease, etc.). My overall point is that proxies (in graphical form) that have a large uncertainty to begin with should be presented as such, IMO, and not spliced together. In effect, the scientists involved seem to be playing both sides of the coin.....very certain of the historical significance of tree ring data as a temperature proxy, but uncertain enough to ponder the various stresses that COULD be reasons for the divergence in the very short term (last 45 or so years....)

The way I understand it, tree ring proxies correlate well with both the instrumental temperature record AND other proxy data obtained through different methodologies. Certain tree ring growth patterns (not all, just some in the very high northern latitudes) diverge following about 1960. They become an outlier amongst data sources. Do you think they should keep obviously bad data as part of the temperature record? What would you say then about the integrity of the temperature record?


The way I understand it, tree ring proxies correlate well with both the instrumental temperature record AND other proxy data obtained through different methodologies. Certain tree ring growth patterns (not all, just some in the very high northern latitudes) diverge following about 1960. They become an outlier amongst data sources. Do you think they should keep obviously bad data as part of the temperature record? What would you say then about the integrity of the temperature record?

I think his claim is that since we can't trust it after 1960, we shouldn't trust it at all, and should completely eliminate it from the analyses...


The way I understand it, tree ring proxies correlate well with both the instrumental temperature record AND other proxy data obtained through different methodologies. Certain tree ring growth patterns (not all, just some in the very high northern latitudes) diverge following about 1960. They become an outlier amongst data sources. Do you think they should keep obviously bad data as part of the temperature record? What would you say then about the integrity of the temperature record?

Seems convenient that scrutiny was given to the proxy data when it inconveniently diverged, yet I doubt equal scrutiny was provided to data from centuries ago (nor could it be equally scrutinized without parallel direct measurement), hence my doubt that the proxy even has a measurable error range, IMO.


Uhh... I fail to see the problem with any of the emails listed in the first post (not reading through the rest of this thread to see what else is addressed). It would certainly be nice to have the whole emails.

Some embarrassing exchanges for sure! Have at them!

http://noconsensus.w...limategate-2-0/

Just a few random ones:

<4241> Wilson:

I thought I’d play around with some randomly generated time-series and see if I could ‘reconstruct’ northern hemisphere temperatures. [...] The reconstructions clearly show a ‘hockey-stick’ trend. I guess this is precisely the phenomenon that Macintyre has been going on about.

If I knew what was in the part cut out I could speak to just how "embarrassing" this email might be. "Randomly-generated" time-series could refer to something like a random initial period, where the results are going to come in the form of an ensemble. Given this email, it's absolutely impossible to tell what's going on. It could say in the [...] part "... if you bin the results and take the 10% with the strongest average positive slope..." And there could be more at the end of the email like "... this is not a valid method because..." We just can't tell based on the short bit we have here.
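Purely as a hypothetical illustration of the screening scenario imagined above (and not of anything documented in the truncated email), the sketch below generates signal-free random walks, keeps only the tenth with the strongest recent upward slope, and averages them; the composite rises at the end even though no underlying trend exists. Every choice in this sketch is invented.

```python
import numpy as np

rng = np.random.default_rng(4)

n_series, n_years = 1000, 150
# Hypothetical: 1000 random-walk "proxies" containing no climate signal at all.
series = rng.normal(size=(n_series, n_years)).cumsum(axis=1)

# Hypothetical screening step: keep the 10% of series that rise most strongly
# over the final 50 "years" (a guess at what the elided [...] might describe).
recent_slope = series[:, -1] - series[:, -50]
keep = recent_slope >= np.quantile(recent_slope, 0.90)
composite = series[keep].mean(axis=0)

# The composite of screened noise trends up at the end despite having no signal.
print(f"composite mean, first 100 yrs: {composite[:100].mean():+.2f}")
print(f"composite mean, last 50 yrs:   {composite[-50:].mean():+.2f}")
```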

<4184> Jones:

[to Hansen] Keep up the good work! [...] Even though it’s been a mild winter in the UK, much of the rest of the world seems coolish – expected though given the La Nina. Roll on the next El Nino! :yikes:

Well yes, some people are so sick of people talking about the supposed recent downward "trend" in temperatures (me included) that a nice El Nino to get people to stop drawing idiotic conclusions would be preferable. Not scientific, but it's an email, not a scientific journal. People are allowed to express their opinions via email.

It's even possible (though I would imagine unlikely) that Jones is a snow-weenie and just wants a Nino for snow in his backyard. Again, lack of context here makes it absolutely impossible to draw any conclusions.

Jones:

[FOI, temperature data]

Any work we have done in the past is done on the back of the research grants we get – and has to be well hidden. I’ve discussed this with the main funder (US Dept of Energy) in the past and they are happy about not releasing the original station data.

Yes, it's illegal to do certain things with data on certain research grants. Sorry, but we live in a country where some data is proprietary.

Steig:

He’s skeptical that the warming is as great as we show in East Antarctica — he thinks the “right” answer is more like our detrended results in the supplementary text. I cannot argue he is wrong.

How is this in ANY way "embarrassing"? Steig is saying someone suggested that a certain region might not see as much warming as he had reported, and he doesn't have any way to show whether the guy is right or wrong. If you see something wrong here, you must be reading a completely different sentence than me.


You are all but saying the scientists are conspiring to delude the public about climate change ( as does the entire e-mail controversy ). Why don't you just come out and say it? You're a conspiracy theorist which you find apropos to your narrow minded view of how science should be conducted. You admit that you feel the scientists have a non-scientific agenda. You don't know of the methodologies and techniques these scientists use to do their research, but through the filter of your own limited knowledge and confirmation bias you see a conspiracy to defraud the public. A little ignorant knowledge can be a dangerous thing in the hands of ideologues such as yourself.

That's your judgement/opinion there, I guess.....However, other warmers and lukewarmers have indeed chimed in with similar "narrow minded" commentary:

http://davidappell.blogspot.com/2011/11/sorting-through-stolen-uae-emails.html


I think his claim is that since we can't trust it after 1960, we shouldn't trust it at all, and should completely eliminate it from the analyses...

Not quite.....the paper discussing the divergence problem notes many potential issues with individual tree ring data as temperature proxies, including environmental factors that are not exclusively anthropogenic. This, IMO, creates not only uncertainty with the data itself, but with establishing potential error, especially when we note the numerous factors that may/could affect the proxy. Maybe I've missed it, but I can't imagine that confidence is high (and it's hinted at in relevant emails) that study of the other factors from centuries ago could be adequate enough to provide even modest certainty with the establishment of error range, wrt tree rings as temp. proxies.


Uhh... I fail to see the problem with any of the emails listed in the first post (not reading through the rest of this thread to see what else is addressed). It would certainly be nice to have the whole emails.

If I knew what was in the part cut out I could speak to just how "embarrassing" this email might be. "Randomly-generated" time-series could refer to something like a random initial period, where the results are going to come in the form of an ensemble. Given this email, it's absolutely impossible to tell what's going on. It could say in the [...] part "... if you bin the results and take the 10% with the strongest average positive slope..." And there could be more at the end of the email like "... this is not a valid method because..." We just can't tell based on the short bit we have here.

Well yes, some people are so sick of people talking about the supposed recent downward "trend" in temperatures (me included) that a nice El Nino to get people to stop drawing idiotic conclusions would be preferable. Not scientific, but it's an email, not a scientific journal. People are allowed to express their opinions via email.

It's even possible (though I would imagine unlikely) that Jones is a snow-weenie and just wants a Nino for snow in his backyard. Again, lack of context here makes it absolutely impossible to draw any conclusions.

Yes, it's illegal to do certain things with data on certain research grants. Sorry, but we live in a country where some data is proprietary.

How is this in ANY way "embarrassing"? Steig is saying someone suggested that a certain region might not see as much warming as he had reported, and he doesn't have any way to show whether the guy is right or wrong. If you see something wrong here, you must be reading a completely different sentence than me.

Mallow,

I'm heading off to my sister's wedding. I'll address your responses at a later time. I certainly cannot argue that the redactions from some of the emails don't provide a moment for pause....however, the entire batch of emails certainly leaves one with the impression that those redactions are generally benign minutiae, and not a context-weeding exercise.

It is clear that the interpretations of the significance of these emails are wide-ranging, mostly falling along the well-established lines already drawn....but no doubt, those involved certainly can't be happy, and are rightly angered (if it was a hack), which speaks to the potential negative impact the release could have.


No it's not.....You present the data as is then, footnote it if there is a perceived problem......unless there is some agenda to fullfill....you cannot read the emails and NOT see there was a concerted effort to create a more "dire" presentation.....sorry!

Their entire life goal is proving a pre-determined outcome. They aren't climate scientists... they are climate-change-proving scientists.

Not much money being generated in forecasting.

Google weather funding.


Seems convenient that scrutiny was given to the proxy data when it inconveniently diverged, yet I doubt equal scrutiny was provided to data from centuries ago (nor could it be equally scrutinized without parallel direct measurement), hence my doubt that the proxy even has a measurable error range, IMO.

The uncertainties involved and the "divergence problem" are generally addressed head on in the peer-reviewed literature. If only you took the time to read it instead of spending countless hours demonstrating to us that you have not.

There is no conspiracy to present pre-1960 tree-ring data as gospel and then vanish the post-1960 data away without explanation. The issue is addressed head on and in full. All you have demonstrated here is your very shallow familiarity with the subject.


It is clear that the interpretations of the significance of these emails are wide-ranging, mostly falling along the well-established lines already drawn....but no doubt, those involved certainly can't be happy, and are rightly angered (if it was a hack), which speaks to the potential negative impact the release could have.

Yes those involved are angered because idiots and hacks will again misinterpret and misrepresent and blow out of proportion the meaning of personal email correspondence.


All of you are wrong, and the answer lies in between the two polar stances. It would be great if the two sides could just agree the other stance has merit.

Thank you for smoking.

This is precisely the view that the wingnut plutocrats who fund the activity of "skeptics" want you to take. They can count on lazy, science-ignorant journalists to take this position, since it requires no research into what is actually going on. The truth is that we are facing what is in all likelihood a severe problem that requires an unpleasant collective effort. Greedy people in oil/gas don't want to do their part, and are bringing out the dogs to make sure that they aren't forced to. I must say that the blindness of people who should really know better about this is really quite sickening.


All of you are wrong, and the answer lies in between the two polar stances. It would be great if the two sides could just agree the other stance has merit.

I think most of us have already said that some of these emails demonstrate less than 100% objectivity. But they are NOT particularly scandalous or significant, and I am not about to compromise on this fact.

