
2014 Global Temperatures


StudentOfClimatology


CFS has shot back up near 0.40C+ on the dailies. Up to 0.134C+ on the monthly.

Looks like another 0.30-0.40C+ day tomorrow. Then a slow drop back to around 0.10-0.15C by the 15th.

The monthly will likely be around 0.175C+ by the 15th, which equates to about 0.73C+ on GISS.
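(For anyone following the CFS-to-GISS conversions in this thread: they amount to adding a constant baseline offset. A minimal Python sketch, with the offset back-solved from the 0.175C/0.73C pair above; it's an inference from this post, not a published constant.)

```python
# Baseline-offset conversion implied by the figures above (assumption).

CFS_TO_GISS_OFFSET = 0.73 - 0.175  # ~0.555C difference between baselines

def cfs_to_giss(cfs_anom_c):
    """Rough GISS (1951-1980 base) equivalent of a CFS monthly anomaly, in C."""
    return cfs_anom_c + CFS_TO_GISS_OFFSET

print(round(cfs_to_giss(0.175), 3))  # 0.73, matching the figure above
```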

Looks like October will turn out relatively cool based on the extrapolation of the 00z intervals...we drop about 0.25C over the next 4-5 days, then sit there for a while: http://cci-reanalyzer.org/Forecasts/

This would fit in with the seasonally-adjusted CERES imbalance for September of +1.68W/m^2, meaning we're emitting more than we're receiving relative to the seasonal means: http://ceres-tool.larc.nasa.gov/ord-tool/jsp/EBAFSelection.jsp


It's going to take the rest of the month being relatively cool to get a cool October, and then some.

Currently CFS converts to a 0.69C+ on GISS. About 30% of the month is in the can.

That will likely reach 0.72 to 0.73C+ by the 12th-13th and sit around there or slowly drop, with CFS running around 0.1C+ into the 15th.

So maybe it's down to about 0.71C+ by the 15th-16th.

To get a GISS equivalent of 0.60C+, the second half of October will need to pull about a -0.05C.

Which I would say is very, very unlikely.
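For what it's worth, here's the running-mean arithmetic behind that last step as a minimal Python sketch. It works in CFS-baseline terms and leans on the ~0.555C CFS-to-GISS offset implied earlier in the thread; both the offset and the mid-month figures are assumptions pulled from this discussion, not official constants.

```python
# Sketch of the month-mean bookkeeping in the post above (assumed values).

OFFSET = 0.555  # assumed CFS -> GISS baseline offset (C), inferred upthread

def required_second_half(target_giss, month_to_date_cfs, frac_done):
    """CFS anomaly the rest of the month must average to hit target_giss."""
    target_cfs = target_giss - OFFSET
    return (target_cfs - frac_done * month_to_date_cfs) / (1.0 - frac_done)

# Mid-month: ~0.71C GISS-equivalent (~0.155C CFS) with half the month done.
print(round(required_second_half(0.60, 0.155, 0.5), 3))  # -0.065
```

Which lands right around the -0.05C ballpark quoted above.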


It's going to take the rest of the month being relatively cool to get a cool October, and then some.

Currently CFS converts to a 0.69C+ on GISS. About 30% of the month is in the can. That will likely reach 0.72 to 0.73C+ by the 12th-13th and sit around there or slowly drop, with CFS running around 0.1C+ into the 15th.

Okay, but we're not hovering, we're losing about 0.30C over the next 5+ days, back to where we were in early October. http://cci-reanalyzer.org/Forecasts/

Furthermore, the CERES data point to a positive radiative imbalance for September. Global temps usually lag CERES by about 4-8 weeks, so a general cooling over this timespan seems more likely than not, IMO.
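If anyone wants to sanity-check the 4-8 week lag claim, a rough sketch of the kind of lagged-correlation test involved is below. It uses synthetic stand-in series rather than the actual CERES ord-tool output, so the data and the ~6-week lag in the demo are placeholders, not results.

```python
# Lagged correlation between a flux series and a temperature series:
# correlate flux against temp shifted progressively later in time.
import numpy as np

def lagged_corr(flux, temp, max_lag_weeks=12):
    """Correlation of flux with temp lagged 0..max_lag_weeks behind it."""
    out = {}
    for lag in range(max_lag_weeks + 1):
        a = flux[: len(flux) - lag] if lag else flux
        b = temp[lag:]
        out[lag] = float(np.corrcoef(a, b)[0, 1])
    return out

# Demo with synthetic data built to lag the flux by ~6 weeks.
rng = np.random.default_rng(0)
flux = rng.standard_normal(200)
temp = np.roll(flux, 6) + 0.5 * rng.standard_normal(200)
print(max(lagged_corr(flux, temp).items(), key=lambda kv: kv[1]))  # peak near lag 6
```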


Pretty big ENSO flip taking place.

Definitely going to promote warming in both the Niño 1+2 and 3/4 regions.

There will be, and already looks to be, warming taking place in the Indian Ocean as well.

That isn't a really strong WWB...it will help it warm a bit in Niño 3.4...but you want to see that a lot larger in coverage and intensity to really ramp things up. We are barely holding onto enough upper-ocean heat content to produce a weak El Niño right now.


It won't take much at this point to break yearly records and set up 2015 to do it again on all data sets.

It's likely going to take more than 0.3C in Niño 3.4 to get it done. Modeling is progging a large increase in Eurasian troughing/snowcover & much colder thermals over the GOA & N-PAC next week, as SSTs drop. With the +AAO holding, a new "hotspot" will have to pop up somewhere.


Things are setting up for a warm November as well as a run at 2005 on UAH for 3rd-warmest year on its record.

Currently UAH is running at roughly 0.26C+ for the year so far with 3 months to go.

2005 came in at 0.29C+.
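Quick sanity check on those two numbers: with nine months at +0.26C banked, matching 2005's +0.29C annual mean requires the final three months to average about +0.38C. A one-liner to verify, assuming the year-to-date figure covers exactly nine months (an assumption on my part):

```python
# What Oct-Dec must average for the 2014 UAH mean to reach +0.29C.
ytd, months_done, target = 0.26, 9, 0.29
print(round((12 * target - months_done * ytd) / (12 - months_done), 2))  # 0.38
```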

I could see the satellite datasets warming as heat/latent heat is advected from the oceans into the atmosphere, yes.


Here's the GISS image. Warm just about everywhere, with the highest anomalies in the Antarctic:

[GISS global temperature anomaly map]

Is there a legitimate reason why they've 'coincidentally' chosen the coldest 30-year block of time in the past 85 years as the baseline? To the average onlooker, it can be deceptive. It's akin to comparing NYC's winter DJF temperature departures against its 25-30 coldest winters on record -- guess what, the majority of years will probably look warm.


Is there a legitimate reason why they've 'coincidentally' chosen the coldest 30-year block of time in the past 85 years as the baseline? To the average onlooker, it can be deceptive. It's akin to comparing NYC's winter DJF temperature departures against its 25-30 coldest winters on record -- guess what, the majority of years will probably look warm.

They started the GISS temp tracking in the 1980s...so naturally 1951-1980 was the most recent complete 30-year period. It was coincidence.

It makes no difference when looking at trends...the only difference it makes is in the raw anomaly.


They started the GISS temp tracking in the 1980s...so naturally 1951-1980 was the most recent complete 30-year period. It was coincidence.

It makes no difference when looking at trends...the only difference it makes is in the raw anomaly.

Makes sense, though wouldn't it make more sense to use the 1951-2010 period (a time frame encompassing more decadal variability)? Then we could more easily see the temperature trend versus a couple of AMO/PDO cycles.

Agreed, the trend wouldn't change, but for someone looking at the above map, if the 1951-2010 period were used there would likely be more normal or cool areas showing up, which provides a more objective look in my opinion.
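To make the re-baselining point concrete: switching baselines just swaps which climatology gets subtracted, shifting every grid cell by a constant while leaving trends intact. A minimal sketch; the array names and toy values are illustrative, not GISS's actual procedure.

```python
import numpy as np

def rebaseline(anom_5180, clim_5180, clim_5110):
    """Re-express 1951-1980-based anomalies against a 1951-2010 base."""
    # absolute field = anomaly + old climatology; then subtract the new one
    return (anom_5180 + clim_5180) - clim_5110

# Toy 2x2 grid: if the newer climatology runs ~0.2C warmer everywhere,
# every anomaly simply shifts 0.2C cooler -- more "normal"/cool areas,
# exactly the visual effect described above.
anom = np.array([[0.9, 0.5], [0.1, -0.2]])
print(rebaseline(anom, np.zeros((2, 2)), np.full((2, 2), 0.2)))
```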


Just to illustrate my point about the appearance to an onlooker, in terms of anomaly, here are the winters since 2008 versus the 1960-1990 baseline (probably the coldest 30-year block):

[winter anomaly map, 1960-1990 baseline]

And then against the 1950-2010 block. The sign of the departures changes over a large expanse of real estate; a fairly significant difference to an onlooker.

[winter anomaly map, 1950-2010 baseline]


How is 0.03C a "good amount"? That's within the published margin of error, dude.

A quick analysis shows that previous records over the last 50 years were broken by an average of 0.016C, so this record beat the old mark by nearly twice the typical margin for a record-breaking month. The margin of error on any given month is very high (nearly 10%), and changes can/will be made to the data that will affect the magnitude of the record....

dude.
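For reference, here's one plausible reading of that "quick analysis" as a sketch: walk a monthly series, flag each new record high, and average the margins by which records fell. The input is assumed to be a plain list of monthly anomalies in C; the exact method behind the 0.016C figure isn't spelled out in the post.

```python
def record_margins(monthly_anoms):
    """Average margin (C) by which new record highs beat the old record."""
    margins, record = [], float("-inf")
    for v in monthly_anoms:
        if v > record:
            if record != float("-inf"):
                margins.append(v - record)
            record = v
    return sum(margins) / len(margins) if margins else float("nan")

print(round(record_margins([0.10, 0.13, 0.12, 0.16]), 3))  # two 0.03C breaks -> 0.03
```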


A statistical analysis with consolidated probabilities cannot be used when analyzing spatiotemporally-governed anomalies. You should know this.

The "average" you seek is an ever changing myth because is has no preference or basis. It is unobserved on the resolution we can work with.

I agree somewhat; the timescales are too short for all sides of the argument. Global temps are warming out of the background for sure, though. We've broken many monthly records thus far.


I agree somewhat; the timescales are too short for all sides of the argument. Global temps are warming out of the background for sure, though. We've broken many monthly records thus far.

Of course they're warming, and will continue to do so. That's beside the point, though. Objectivity must be maintained in order to limit mistakes.

We all have biases...the best scientists are able to recognize their own, opening a floodgate of reality which can then be processed and formulated into a better understanding of physical truth.


A statistical analysis with consolidated probabilities cannot be used when analyzing spatiotemporally-governed anomalies. You should know this.

The "average" you seek is an ever-changing myth because it has no preference.

So a statistical parameter is a myth? These are numbers; any simple SPSS or Excel analysis can be run on them with a defined uncertainty.

You realize, based on your statement above, that there would be no point in defining a baseline if "averages" were a myth for temporally defined anomalies? It makes not an ounce of sense.

But regardless, I'm not sure why you want to attempt to downplay the global temperature record with a semantic argument.


A statistical analysis with consolidated probabilities cannot be used when analyzing spatiotemporally-governed anomalies. You should know this.

The "average" you seek is an ever changing myth because is has no preference or basis. It is unobserved on the resolution we can work with.

 

I don't think this is correct at all unless you show the error is biased.

