
2015 Global Temperatures


nflwxman

Recommended Posts

All research funding should go into satellites... If there is a problem with them, fix it. This is a far superior data set to invest in.

 

The problem is there is very little redundancy in satellite data.  If a dozen GHCN v3 stations are screwy, it barely affects the overall trend/dataset in GISS or HadCRUT4.  If one spectrometer on an orbital remote-sensing instrument is screwy, it can absolutely cause massive errors that need to be corrected in imprecise ways.  To reduce the uncertainty in satellite data, NASA (or any other space program) would have to spend billions of dollars launching more.
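The redundancy argument is easy to illustrate with a toy Monte Carlo (a sketch with made-up numbers, not real station or satellite data): a handful of biased stations barely moves a large network's mean, while the same bias in a single satellite instrument propagates into the entire record.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "true" anomaly shared by all stations, plus independent noise.
n_stations = 1000
true_anomaly = 0.5
readings = true_anomaly + rng.normal(0.0, 0.2, n_stations)

# Corrupt a dozen stations with a large +2.0 C bias.
readings[:12] += 2.0
network_mean = readings.mean()
error_network = network_mean - true_anomaly  # roughly 12/1000 * 2.0 = 0.024 C

# A single satellite instrument with the same +2.0 C bias corrupts the
# whole record; there is no averaging across instruments to dilute it.
error_satellite = 2.0

print(f"network error:   {error_network:+.3f} C")
print(f"satellite error: {error_satellite:+.3f} C")
```

With a thousand stations, a dozen bad ones contribute only a few hundredths of a degree of error to the mean, which is the point about repetition washing out local problems.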


All research funding should go into satellites... If there is a problem with them, fix it. This is a far superior data set to invest in.

 

Absolutely not.  For one, they don't measure the same thing.  For another, the costs are very different.  Launching satellites is a risky and expensive endeavor, while placing weather stations is cheap and easy.  That's not even mentioning the other methods of taking temperature readings used for a variety of meteorological and scientific purposes.

 

Putting our eggs in one basket is absolutely foolish.


Absolutely not.  For one, they don't measure the same thing.  For another, the costs are very different.  Launching satellites is a risky and expensive endeavor, while placing weather stations is cheap and easy.  That's not even mentioning the other methods of taking temperature readings used for a variety of meteorological and scientific purposes.

 

Putting our eggs in one basket is absolutely foolish.

 

Yes, the easiest thing to do is to put in a network of pristine weather stations...which has already been done in the United States. It is called "USCRN". The beauty of that network is that there is no need for adjustments, which can be (and in the past have been) subjective out of necessity.

 

USCRN has only been collecting data for about a decade, though, so it isn't very useful yet for climate trends. A comparable network really needs to be developed on a global basis, and the same goes for SST anomalies, which have been subjected to massive adjustments (see the change in ERSSTv4 earlier this year that made headlines everywhere).
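The short-record problem is statistical: the uncertainty of a fitted linear trend shrinks rapidly with record length. A rough simulation with assumed numbers (a 0.02 C/yr trend and 0.1 C of year-to-year noise; both are illustrative, not USCRN values) shows why a decade of data constrains a trend far less than three decades do:

```python
import numpy as np

rng = np.random.default_rng(1)
true_trend = 0.02  # C per year, an assumed background warming rate

def fitted_trend_spread(years, n_trials=2000):
    """Std. dev. of OLS trend estimates for a noisy annual series."""
    t = np.arange(years)
    trends = []
    for _ in range(n_trials):
        y = true_trend * t + rng.normal(0.0, 0.1, years)  # 0.1 C noise
        slope = np.polyfit(t, y, 1)[0]
        trends.append(slope)
    return np.std(trends)

# For white noise the slope uncertainty falls off roughly as length^(-3/2),
# so a 10-year record is far less constraining than a 30-year one.
print(fitted_trend_spread(10), fitted_trend_spread(30))
```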

 

All of this would be way cheaper than satellites. Of course, satellites have uses other than just global temps. But we shouldn't put extras up there just for the sake of getting more accurate global temps. We can get far more accurate global temps much more cheaply by standardizing and raising the bar for our surface measurements.


The surface temperature dataset is being brought into question because Jonger believes they are biased warm. You need not waste time with a serious response. Deniers are a waste of space.

 

Keep the garbage in the banter thread please. It is actually a legit discussion on why satellites are not as reliable as one might think and what can be done to improve the surface data.


I circled Nov 97 on the latest UAH6 chart. One month before the Dec-Feb liftoff.

 

[attachment: UAH_LT_1979_thru_November_2015_v6.png]

 

This will be a good test for UAH v6.  If the temperature doesn't spike above 1998 levels, one has to further question the orbital drift correction.  As it stands, many are already very skeptical of the dataset due to the lagging peer review and the massive trend reduction from previous versions.


Yes, the easiest thing to do is to put in a network of pristine weather stations...which has already been done in the United States. It is called "USCRN". The beauty of that network is that there is no need for adjustments, which can be (and in the past have been) subjective out of necessity.

 

USCRN has only been collecting data for about a decade, though, so it isn't very useful yet for climate trends. A comparable network really needs to be developed on a global basis, and the same goes for SST anomalies, which have been subjected to massive adjustments (see the change in ERSSTv4 earlier this year that made headlines everywhere).

 

All of this would be way cheaper than satellites. Of course, satellites have uses other than just global temps. But we shouldn't put extras up there just for the sake of getting more accurate global temps. We can get far more accurate global temps much more cheaply by standardizing and raising the bar for our surface measurements.

 

I'm absolutely in favor of more satellites and of including more instruments on each platform.  For one, the more platforms available, the closer we can get to continuous coverage over as much area as possible.  As it stands, we're not putting enough money into our satellite systems, IMO.


I'm absolutely in favor of more satellites and of including more instruments on each platform.  For one, the more platforms available, the closer we can get to continuous coverage over as much area as possible.  As it stands, we're not putting enough money into our satellite systems, IMO.

 

The cost is ridiculous compared to the product you get in return if your main goal is global temperatures.

 

 

Obviously satellites are very good for uses other than measuring global temps, so you can put multi-purpose units up there. But until it becomes cheaper, it's going to be difficult to get the redundancy necessary to build an awesome satellite dataset for global temps. They are okay right now, but obviously there are limitations.

 

edit: I should probably clarify I'm talking in relative terms with the cost. I 100% agree that more data is better when it comes to the science of temperature datasets...whether it is satellites, raobs, pristine sfc stations, etc.


This will be a good test for UAH v6. If the temperature doesn't spike above 1998 levels, one has to further question the orbital drift correction. As it stands, many are already very skeptical of the dataset due to the lagging peer review and the massive trend reduction from previous versions.

As I posted earlier, it should spike .25C-.30C above 1998 given the expected background warming in the last 17 years. Simply being at 1998 levels would not be impressive.
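The 0.25C-0.30C figure is just an assumed background trend multiplied by the elapsed time. Taking roughly 0.015-0.018 C/yr as the underlying warming rate (my assumption for illustration, not a quoted value) over the 17 years since 1998:

```python
# Back-of-the-envelope check on the "0.25-0.30 C above 1998" expectation:
# 17 years of background warming at an assumed 0.015-0.018 C/yr trend.
years = 2015 - 1998
for trend in (0.015, 0.018):
    print(f"{trend} C/yr x {years} yr = {trend * years:.3f} C")
```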

It is definitely possible the plateau in warming will make it difficult for the satellites to peak in the 0.9C range.


Here is the RSS total precipitable water (TPW) product from satellite microwave through Nov 2015.  As might be expected, the current nino has spiked TPW to record levels with the biggest increase in the tropics.  Compared to satellite temperature, water vapor has less ENSO lag and a clearer upward trend with time. Another indication that satellite troposphere temperature is probably underestimating warming. In any case, RSS/UAH temperature should follow TPW with a Nino spike of similar magnitude to 97/98.

 

http://www.remss.com/measurements/atmospheric-water-vapor

 



As I posted earlier, it should spike .25C-.30C above 1998 given the expected background warming in the last 17 years. Simply being at 1998 levels would not be impressive.

It is definitely possible the plateau in warming will make it difficult for the satellites to peak in the 0.9C range.

 

You mean the unreliability of the measurement will make it difficult for the satellites to peak in the .9C range. (Depending on data source - some data sources will definitely be in the .9C range).

 

Why waste time with satellite measurements? It's like trying to take the internal temperature of the turkey in the oven by using binoculars from your living room to see how golden brown it is, then performing calculations to estimate the internal temperature from how crispy it looks. Or you could just stick a good old-fashioned thermometer in it. 175.0F and done.

 

Satellites have a track record of unreliability.


The problem is there is very little redundancy in satellite data.  If a dozen GHCN v3 stations are screwy, it barely affects the overall trend/dataset in GISS or HadCRUT4.  If one spectrometer on an orbital remote-sensing instrument is screwy, it can absolutely cause massive errors that need to be corrected in imprecise ways.  To reduce the uncertainty in satellite data, NASA (or any other space program) would have to spend billions of dollars launching more.

 

We could launch a few redundant networks and just average them all out.


Here is the RSS total precipitable water (TPW) product from satellite microwave through Nov 2015.  As might be expected, the current nino has spiked TPW to record levels with the biggest increase in the tropics.  Compared to satellite temperature, water vapor has less ENSO lag and a clearer upward trend with time. Another indication that satellite troposphere temperature is probably underestimating warming. In any case, RSS/UAH temperature should follow TPW with a Nino spike of similar magnitude to 97/98.

 

http://www.remss.com/measurements/atmospheric-water-vapor

 

[attachment: tpwrss.png]

 

 

It seems like it was edging up even before the El Nino.   Interesting graph regardless; I'd like to see how much it changes during a La Nina.


Doubling Down on Our Faustian Bargain

 

Humanity's Faustian climate bargain is well known. Humans have been pumping both greenhouse gases (mainly CO2) and aerosols (fine particles) into the atmosphere for more than a century. The CO2 accumulates steadily, staying in the climate system for millennia, with a continuously increasing warming effect. Aerosols have a cooling effect (by reducing solar heating of the ground) that depends on the rate that we pump aerosols into the air, because they fall out after about five days.
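The contrast between the two forcers can be sketched with a toy one-box model (arbitrary units and emission rates, purely illustrative): with a ~5-day residence time, the aerosol burden equilibrates at roughly emission rate times lifetime and so tracks the emission *rate*, while CO2 simply accumulates with *cumulative* emissions.

```python
import numpy as np

n_days = 365
emission = 1.0   # constant emission per day (arbitrary units)
tau = 5.0        # aerosol residence time in days

aerosol = np.zeros(n_days)
co2 = np.zeros(n_days)
for i in range(1, n_days):
    # Aerosols: constant source, first-order removal with ~5-day lifetime.
    aerosol[i] = aerosol[i - 1] + emission - aerosol[i - 1] / tau
    # CO2: effectively no decay on this timescale, so it just accumulates.
    co2[i] = co2[i - 1] + emission

# Aerosol burden levels off near emission * tau; CO2 keeps climbing.
print(aerosol[-1], co2[-1])
```

This is why cutting aerosol emissions removes their cooling within days, while the CO2 warming commitment persists.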

 

Aerosol cooling probably reduced global warming by about half over the past century, but the amount is uncertain because global aerosols and their effect on clouds are not measured accurately. Aerosols increased rapidly after World War II as fossil fuel use increased ~5 percent/year with little pollution control (Fig. 1). Aerosol growth slowed in the 1970s with pollution controls in the U.S. and Europe, but accelerated again after ~2000.

[Fig. 1: screenshot from the original article]

[Fig. 2: screenshot from the original article]

The rapid growth of fossil fuel CO2 emissions in the past decade is mainly from increased coal use (Fig. 1), mostly in China with little control of aerosol emissions. It is thus likely that there has been an increase in the negative (cooling) climate forcing by aerosols in the past decade, as suggested by regional aerosols measurements in the Far East, but until proper global aerosol monitoring is initiated, as discussed below, the aerosol portion of the amplified Faustian bargain remains largely unquantified.

 

In our current paper we describe another component to the fossil fuel Faustian bargain, which is suggested by a careful look at observed atmospheric CO2 change (Fig. 2). The orange curve in Fig. 2 is the 12-month change of CO2 at Mauna Loa. This curve is quite "noisy," in part because it has double noise, being affected by short-term variability at both the start-point and end-point in taking the 12-month difference in CO2 amount. A more meaningful measure of the CO2 growth is provided by the 12-month running mean (red curve in Fig. 2). The temporal variability of the red curve has physical significance, most of the variability being accounted for by the Southern (El Nino-La Nina) Oscillation and the Pinatubo volcanic eruption in the early 1990s, as discussed in our paper.

 

NOAA recently reported the second largest annual CO2 increase in their Mauna Loa record. What they report is the end-of-year change in the noisy orange curve, the end-of-year values being indicated by blue asterisks in Fig. 2. It is practically certain that still larger CO2 increases will soon be reported because of the huge increase in the rate of fossil fuel CO2 emissions in the past decade (black curve in Fig. 1); indeed, we must expect reports of annual CO2 increases exceeding 3 ppm CO2.

 

Independent of a possible aerosol effect on the carbon cycle, it is known that aerosols are an important climate forcing. The IPCC concludes that aerosols are a negative (cooling) forcing, probably between -0.5 and -2.5 W/m2. Hansen et al., based mainly on analysis of Earth's energy imbalance, derive an aerosol forcing of -1.6 ± 0.3 W/m2, consistent with an analysis of Murphy et al. that suggests an aerosol forcing of about -1.5 W/m2. This large negative aerosol forcing reduces the net climate forcing of the past century by about half.

 

Reduction of the net human-made climate forcing by aerosols has been described as a "Faustian bargain," because the aerosols constitute deleterious particulate air pollution. Reduction of the net climate forcing by half will continue only if we allow air pollution to build up to greater and greater amounts. More likely, humanity will demand and achieve a reduction of particulate air pollution, whereupon, because the CO2 from fossil fuel burning remains in the surface climate system for millennia, the "devil's payment" will be extracted from humanity via increased global warming.

 

So is the new data we present here good news or bad news, and how does it alter the "Faustian bargain"? At first glance there seems to be some good news. First, if our interpretation of the data is correct, the surge of fossil fuel emissions, especially from coal burning, along with the increasing atmospheric CO2 level is "fertilizing" the biosphere, and thus limiting the growth of atmospheric CO2. Also, despite the absence of accurate global aerosol measurements, it seems that the aerosol cooling effect is probably increasing based on evidence of aerosol increases in the Far East.

 

Both effects work to limit global warming and thus help explain why the rate of global warming seems to be less this decade than it has been during the prior quarter century. This data interpretation also helps explain why multiple warnings that some carbon sinks are "drying up" and could even become carbon sources, e.g., boreal forests infested by pine bark beetles and the Amazon rain forest suffering from drought, have not produced an obvious impact on atmospheric CO2.

 

However, increased CO2 uptake does not necessarily mean that the biosphere is healthier or that the increased carbon uptake will continue indefinitely. Nor does it change the basic facts about the potential magnitude of the fossil fuel carbon source and the long lifetime of fossil fuel CO2 in the surface carbon reservoirs (atmosphere, ocean, soil, biosphere) once the fossil fuels are burned. Fertilization of the biosphere affects the distribution of the fossil fuel carbon among these reservoirs, at least on the short run, but it does not alter the fact that the fossil carbon will remain in these reservoirs for millennia.

 

The principal implication of our present analysis relates to the Faustian bargain. Increased short-term masking of greenhouse gas warming by fossil fuel particulate and nitrogen pollution is a "doubling down" of the Faustian bargain, an increase in the stakes. The more we allow the Faustian debt to build, the more unmanageable the eventual consequences will be. Yet globally there are plans to build more than 1,000 coal-fired power plants and plans to develop some of the dirtiest oil sources on the planet. These plans should be vigorously resisted. We are already in a deep hole -- it is time to stop digging.

 

The tragedy of this science story is that the great uncertainty in interpretations of the climate forcings did not have to be. Global aerosol properties should be monitored to high precision, similar to the way CO2 is monitored. The capability of measuring detailed aerosol properties has long existed, as demonstrated by observations of Venus. The requirement is measurement of the polarization of reflected sunlight to an accuracy of 0.1 percent, with measurements covering the spectral range from near ultraviolet to the near-infrared at a range of scattering angles, as is possible from an orbiting satellite. Unfortunately, the satellite mission designed for that purpose failed to achieve orbit, suffering precisely the same launch failure as the Orbiting Carbon Observatory (OCO). Although a replacement OCO mission is in preparation, no replacement aerosol mission is scheduled.

http://www.huffingtonpost.com/dr-james-hansen/doubling-down-on-our-faustian-bargain_b_2989535.html


GHG forcing is through the roof; it is impossible to explain this away with natural variability. In addition, the Western Pacific SSTA is usually below average during strong El Nino events. Regardless, you can find some similarity. The Northern PAC is simply unlike anything we've seen.

 

Here is 97 and 09...

 

[image: Dec1997-2009wScale.jpg]

 



Archived

This topic is now archived and is closed to further replies.
