Lou Uccellini promises big US model upgrades


Recommended Posts

Yeah, so curiously enough, NBC News ran a segment on the relative deficiency of NCEP (USA! USA! USA!) models not long ago:

http://www.nbcnews.com/video/nightly-news/51108647/#51108647

I suppose when you've got Al Roker of all people grilling Lou Uccellini about why our weather models suck so bad, it's time to put some money into those computers :P


That was brutal--when I saw Al Roker I wanted to skip the clip. Nonetheless, I hope Uccellini does something; it just doesn't seem like a priority for the US at this time. The co-author of NE Snowstorms has got to bring more gravitas to this issue. Of course, all of this is playing out amid a national debate about spending on such things as White House public visits. Oh well, I guess we keep paying for the Euro.


Hmmm, I didn't see the word "promise" in there anywhere. I hope he really means it, because our models suck! lol.

Anyway, I'm tired of hearing how well the ECMWF did with Sandy compared to the GFS, and of that one event being used as the ultimate measuring stick between the two models. Folks may want to revisit the modeling of TS Debby and add that to their 'scientific' model analysis.


Anyway, I'm tired of hearing how well the ECMWF did with Sandy compared to the GFS, and of that one event being used as the ultimate measuring stick between the two models. Folks may want to revisit the modeling of TS Debby and add that to their 'scientific' model analysis.

 

This! Also, the GFS and the rest of the NCEP model suite have done rather well across the West and Plains this winter, and the constant bashing in some subforums by weenies about how inferior the American global computer guidance supposedly is comes across as laughable at best.


This! Also, the GFS and the rest of the NCEP model suite have done rather well across the West and Plains this winter, and the constant bashing in some subforums by weenies about how inferior the American global computer guidance supposedly is comes across as laughable at best.

 

The GFS has its biases/weaknesses just like any other piece of model guidance.  I'd like to see NCEP can the 06z/18z cycles in favor of better data assimilation and finer vertical/horizontal grid spacing.



Is it possible to upgrade the GFS/NAM to use the same 4DVAR data assimilation that the ECMWF and GGEM use? Or would we need to develop a completely new model?

No.  We don't have the TL (tangent-linear) and AD (adjoint/transpose) versions of the models required to do this.  There was work being done in this regard for the GFS, but the funding was pulled a while ago.

 

We have already implemented a hybrid EnVar algorithm for the GFS/GDAS, which has a 4D extension (this is not quite the same as 4DVAR, since it uses ensemble perturbations instead of a dynamic model for the temporal aspects of the solution).  This is the same paradigm that both the UKMO and the Canadians have embraced and are working toward in parallel with NCEP.
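For anyone who wants to see what that means mathematically, here is a schematic of the hybrid EnVar cost function as it is usually written in the published literature (a general sketch, not the exact operational GSI configuration; the symbols are the standard textbook ones rather than anything quoted from NCEP):

```latex
% Schematic hybrid EnVar cost function (general literature form, not the
% operational GSI configuration). The analysis increment combines a static
% background-error covariance term with a weighted sum of ensemble perturbations.
%   \delta x_s : increment tied to the static covariance B_s
%   a_k        : extended control variables weighting the K ensemble perturbations x^e_k
%   A          : localization correlation matrix,  \circ : Schur (elementwise) product
%   d          : innovation (obs minus background), H : linearized obs operator, R : obs-error covariance
\begin{align}
  \delta\mathbf{x} &= \delta\mathbf{x}_s + \sum_{k=1}^{K} \mathbf{a}_k \circ \mathbf{x}^{e}_k, \\
  J(\delta\mathbf{x}_s,\mathbf{a}) &=
      \tfrac{\beta_s}{2}\,\delta\mathbf{x}_s^{\mathrm T}\mathbf{B}_s^{-1}\,\delta\mathbf{x}_s
    + \tfrac{\beta_e}{2}\,\mathbf{a}^{\mathrm T}\mathbf{A}^{-1}\mathbf{a}
    + \tfrac{1}{2}\left(\mathbf{H}\,\delta\mathbf{x}-\mathbf{d}\right)^{\mathrm T}
      \mathbf{R}^{-1}\left(\mathbf{H}\,\delta\mathbf{x}-\mathbf{d}\right).
\end{align}
```

The weights beta_s and beta_e act as penalties on their respective terms: penalizing the ensemble control variables heavily collapses the scheme toward ordinary static-B 3DVAR, while the opposite limit draws the increment almost entirely from the (localized) ensemble covariance.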


The GFS has its biases/weaknesses just like any other piece of model guidance.  I'd like to see NCEP can the 06z/18z cycles in favor of better data assimilation and finer vertical/horizontal grid spacing.

That's the thing: this isn't a zero-sum game.  We can't simply take cycles from 06z/18z to run things faster at 00z/12z.  We have insanely strict time requirements for getting our products out, and there is no turning back now (customers would never let us delay things).  Also, we have to run those cycles, if for no other reason than to provide boundary conditions for the 8 billion downstream models that our customers request/require/crave, in addition to the cycling required for DA.

 

Also, what "better data assimilation" are you talking about?


That's the thing: this isn't a zero-sum game.  We can't simply take cycles from 06z/18z to run things faster at 00z/12z.  We have insanely strict time requirements for getting our products out, and there is no turning back now (customers would never let us delay things).  Also, we have to run those cycles, if for no other reason than to provide boundary conditions for the 8 billion downstream models that our customers request/require/crave, in addition to the cycling required for DA.

 

Also, what "better data assimilation" are you talking about?

 

I made no reference to running the GFS faster.  I just don't see the need for four model cycles; two should suffice.  As far as "better data assimilation" goes, I'd like to see a higher overall grid resolution.  It looks as though this is already in the works.

 

http://www.emc.ncep.noaa.gov/GFS/rd.php


I made no reference to running the GFS faster.  I just don't see the need for four model cycles; two should suffice.  As far as "better data assimilation" goes, I'd like to see a higher overall grid resolution.  It looks as though this is already in the works.

 

http://www.emc.ncep.noaa.gov/GFS/rd.php

You did say:  "I'd like to see NCEP can the 06z/18z cycles in favor of better data assimilation and finer vertical/horizontal grid spacing."

 

Getting rid of those two cycles does nothing to improve the data assimilation or allow for higher resolution.  That was my point.

 

And yes, work is already underway for a significant upgrade to the GFS, including a substantial increase to the horizontal resolution of the model (and corresponding assimilation).  First, we need the new supercomputer to go live (scheduled for August); then we can work on implementing new technology.  I can't give details or a specific timeline since it is all still preliminary.

 

In terms of changing the vertical resolution, that is also in the works but quite a ways away.  Changing the model levels is a huge undertaking, since it has such a profound impact on how the physical parameterizations behave (on the model side) and how the assimilation of satellite data works (on the initialization side).  ECMWF has been working on moving to a 137-layer version of their model for years (and it is still not operational).


I'd like to one day see the GFS go toe-to-toe with the ECMWF on model verification.  It would be nice for NCEP to adopt 4DVAR data assimilation.  I realize that the massive dedicated funding and computing power of the ECMWF people may make this hope unreasonable.

First, it is something we all want, but look at the last 25+ years:

http://www.emc.ncep.noaa.gov/gmb/STATS/html/aczrnmn4.html

 

This is not something new.  Again, not only do they have faster/larger computers, they have a much more focused (i.e. smaller) mission than does NWS/NCEP.  I encourage you to check out this blog post by Rick Rood:

http://www.washingtonpost.com/blogs/capital-weather-gang/post/to-be-the-best-in-weather-forecasting-why-europe-is-beating-the-us/2013/03/08/429bfcd0-8806-11e2-9d71-f0feafdd1394_blog.html

 

It highlights nicely some things that have to change on a programmatic level.  More computing is only part of the solution.

 

As I mentioned in a previous post, NCEP is aggressively pursuing extensions of something called (hybrid) EnVar (and as I understand it, so are Environment Canada and the UK Met Office).  There are several reasons for this, namely avoiding the maintenance of TL/AD versions of the dynamic models, but also keeping up with scalability as computer technology continues to advance rapidly.  EnVar has a 4D extension to it, and several experiments and studies have demonstrated that this paradigm can be competitive with 4DVAR (and hybrid variants thereof).
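To make that "4D extension" a bit more concrete, here is the usual schematic contrast from the EnVar literature (a general sketch, not a specification of any center's implementation): 4DVAR evolves the increment across the assimilation window with the tangent-linear model and its adjoint, while 4D-EnVar builds the time-evolved increment from ensemble perturbations already valid at each time in the window, which is exactly why no TL/AD model is needed.

```latex
% Schematic contrast between 4DVAR and 4D-EnVar (general literature form).
%   M_t, M_t^T : tangent-linear model and its adjoint over the window (4DVAR only)
%   x^e_k(t)   : pre-computed ensemble perturbation k valid at time t in the window
\begin{align}
  \text{4DVAR:}\qquad
    \delta\mathbf{x}(t) &= \mathbf{M}_t\,\delta\mathbf{x}(t_0)
      &&\text{(requires }\mathbf{M}_t\text{ and }\mathbf{M}_t^{\mathrm T}\text{)}\\
  \text{4D-EnVar:}\qquad
    \delta\mathbf{x}(t) &= \delta\mathbf{x}_s
      + \sum_{k=1}^{K}\mathbf{a}_k \circ \mathbf{x}^{e}_k(t)
      &&\text{(no TL/AD model; the ensemble carries the time evolution)}
\end{align}
```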


I made no reference to running the GFS faster.  I just don't see the need for four model cycles; two should suffice.  As far as "better data assimilation" goes, I'd like to see a higher overall grid resolution.  It looks as though this is already in the works.

 

http://www.emc.ncep.noaa.gov/GFS/rd.php

 

Just stopping the dissemination of a few GFS files that are ancient MRF look-alikes takes quite an effort.  To ditch two complete cycles of the GFS and all their output...now that's funny...and impossible in this or any universe.  As dtk said, it'd mean a complete overhaul of the NCEP production suite and the ditching of many other model runs and cycles.


First, it is something we all want, but look at the last 25+ years:

http://www.emc.ncep.noaa.gov/gmb/STATS/html/aczrnmn4.html

 

This is not something new.  Again, not only do they have faster/larger computers, they have a much more focused (i.e. smaller) mission than does NWS/NCEP.  I encourage you to check out this blog post by Rick Rood:

http://www.washingtonpost.com/blogs/capital-weather-gang/post/to-be-the-best-in-weather-forecasting-why-europe-is-beating-the-us/2013/03/08/429bfcd0-8806-11e2-9d71-f0feafdd1394_blog.html

 

It highlights nicely some things that have to change on a programmatic level.  More computing is only part of the solution.

 

As I mentioned in a previous post, NCEP is aggressively pursuing extensions of something called (hybrid) EnVar (and as I understand it, so are Environment Canada and the UK Met Office).  There are several reasons for this, namely avoiding the maintenance of TL/AD versions of the dynamic models, but also keeping up with scalability as computer technology continues to advance rapidly.  EnVar has a 4D extension to it, and several experiments and studies have demonstrated that this paradigm can be competitive with 4DVAR (and hybrid variants thereof).

 

Dtk, yeah, you're not going to hear from me the weenie nonsense about "are the models really better?".  Both the GFS and EC have made great strides in the last 20 years.  However, it's always that contest where you're getting better but your competition continues to be a bit better than you, and you can't close the gap.


Those stats aren't operationally significant. It's funny that some folks believe they are, but when it comes down to daily sensible wx forecasting, the op models each have their own outlying moments.

 

I'd bet they are.  The TV mets who rip and read the GFS do noticeably worse.  You'd definitely have better forecasts if you ripped the EC at day 3 to 5.  Obviously a blend with pattern recognition and local climo knowledge is going to give you the best forecast.


I'd bet they are.  The TV mets who rip and read the GFS do noticeably worse.  You'd definitely have better forecasts if you ripped the EC at day 3 to 5.  Obviously a blend with pattern recognition and local climo knowledge is going to give you the best forecast.

No, they aren't. If you took a random model run and modelcasted at a random time...either the GFS or the ECMWF would have the same chance of being the worse outlier from reality. They go back and forth, and I've never seen one model consistently do better than another, especially in the cold season.


No, they aren't. If you took a random model run and modelcasted at a random time...either the GFS or the ECMWF would have the same chance of being the worse outlier from reality. They go back and forth, and I've never seen one model consistently do better than another, especially in the cold season.

I don't agree with this.  I think the differences are noticeable.


I don't agree with this.  I think the differences are noticeable.

That's fine, you don't have to agree with me. I'm just saying that has been my experience throughout my 16 years of operational forecasting for NOAA and the Air Force.


That's fine, you don't have to agree with me. I'm just saying that has been my experience throughout my 16 years of operational forecasting for NOAA and the Air Force.

We have plenty of other metrics to quantify the discrepancy in skill, and in a mean sense for many metrics, the EC is better (there is no debate).  However, the gap isn't nearly as large as people seem to think it is.  Also, as you say, this does not guarantee that either model will be "better" for any discrete (localized) event.
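For anyone curious what is actually behind headline scores like the verification page linked earlier in the thread, the usual benchmark is the 500-hPa geopotential height anomaly correlation. A minimal sketch of the generic formula (array shapes, the climatology source, and the function name are placeholders; this is not EMC's operational verification code):

```python
import numpy as np

def anomaly_correlation(forecast, analysis, climatology, weights=None):
    """Centered pattern (anomaly) correlation between a forecast and the
    verifying analysis. Inputs are lat/lon arrays of, e.g., 500-hPa height;
    `climatology` is the long-term mean for the valid date and `weights`
    (e.g., cos(latitude)) are optional area weights. Generic formula only."""
    f = np.asarray(forecast) - np.asarray(climatology)    # forecast anomaly
    a = np.asarray(analysis) - np.asarray(climatology)    # analyzed anomaly
    w = np.ones_like(f) if weights is None else np.asarray(weights)

    fm = np.average(f, weights=w)                          # area-mean anomalies
    am = np.average(a, weights=w)
    cov = np.average((f - fm) * (a - am), weights=w)
    return cov / np.sqrt(np.average((f - fm) ** 2, weights=w) *
                         np.average((a - am) ** 2, weights=w))
```

A commonly quoted rule of thumb is that a forecast stops being useful once its 500-hPa anomaly correlation drops below roughly 0.6, which is why the long-term AC curves are the usual yardstick for comparing the global centers.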


That's fine, you don't have to agree with me. I'm just saying that has been my experience throughout my 16 years of operational forecasting for NOAA and the Air Force.

 

I happen to agree here, and this is coming from a guy who has seen models in some form since the days of the NGM, the baroclinic, the LFM, the old AVN, etc. (I started college in late August of 1992). The models in general have gotten much better over the years, and there are storms in which the GFS had the advantage, but in the longer range it has mostly been the ECMWF on top. Last I knew, the Canadian regional model was running full-blown 4DVAR as of last spring/summer, while full-blown 4DVAR is in the global run now. And the advantage the US short-range models had over the Canadian regional models at handling convection has now closed significantly operationally.

Let's hope they get that new hardware going on time. The worst part is, we have computers within the US government right now that could easily run 4DVAR; the only problem is that they're being used for modeling nuclear weapons in places like the Department of Energy. I know that having the "best 'motor' in the world" won't solve all the problems, since all the other parts need to be able to harness the power, but it sure does help to have raw power to spare compared to the competition. As for global and regional models ever getting up to snuff to totally replace people like us forecasters, though, that is still a ways off.


I happen to agree here, and this is coming from a guy who has seen models in some form since the days of the NGM, the baroclinic, the LFM, the old AVN, etc. (I started college in late August of 1992). The models in general have gotten much better over the years, and there are storms in which the GFS had the advantage, but in the longer range it has mostly been the ECMWF on top. Last I knew, the Canadian regional model was running full-blown 4DVAR as of last spring/summer, while full-blown 4DVAR is in the global run now. And the advantage the US short-range models had over the Canadian regional models at handling convection has now closed significantly operationally.

Let's hope they get that new hardware going on time. The worst part is, we have computers within the US government right now that could easily run 4DVAR; the only problem is that they're being used for modeling nuclear weapons in places like the Department of Energy. I know that having the "best 'motor' in the world" won't solve all the problems, since all the other parts need to be able to harness the power, but it sure does help to have raw power to spare compared to the competition. As for global and regional models ever getting up to snuff to totally replace people like us forecasters, though, that is still a ways off.

The Canadians have been running 4DVAR in their global model for some time now.  I've said this before, and I'll say it a million times more: although 4DVAR has been the "gold standard" in assimilation for some time, it is not the be-all, end-all.  Furthermore, being able to run 4DVAR is NOT just a matter of computing, as there is a huge development and maintenance cost since you need to have available the linearized and transpose (adjoint) versions of the model.  There are also issues of scalability.  For these latter two reasons, centers such as NCEP, UKMO, and Env. Canada are moving toward other paradigms, namely EnVar (i.e., "4D" assimilation algorithms that do not require the linearized and adjoint models).


  • 4 weeks later...

Spending bajillions of dollars on climate supercomputers while weather computing power lags behind is total bullsh*t that needs to stop. I think we already have climate computing well covered...let's worry about tornado outbreaks and hurricanes 3 days from now. 


An interesting thread to read. I, for one, love having four cycles of the GFS/NAM. For dprog/dt within day 4, especially for major storm events, it is crucial to have those extra runs. With respect to data assimilation, dtk is spot on (as I would expect him to be) about 4DVAR not being the be-all, end-all that some think it is. The Canadian global is a great example of that. Statistics aside, from a usability standpoint, the GFS is substantially better than the Canadian. Finally, with respect to the ECMWF, the differences are not as dramatic as some believe. I am routinely impressed by the overall performance of the GFS compared to the ECMWF, given the substantially smaller resources available to NCEP.
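As a quick illustration of the dprog/dt point (the numbers below are invented; this just shows the idea of tracking run-to-run continuity for one fixed valid time): with four cycles a day you get twice as many successive looks at the same event, so trends in the guidance show up sooner.

```python
# Toy dprog/dt check: how is the forecast for a single fixed valid time
# trending across successive model cycles? Values are invented for illustration.

def run_to_run_trend(forecasts_by_cycle):
    """forecasts_by_cycle: list of (cycle, value) pairs for the SAME valid
    time, oldest to newest. Returns the cycle-to-cycle changes."""
    return [(new_cycle, round(new_val - old_val, 2))
            for (_, old_val), (new_cycle, new_val)
            in zip(forecasts_by_cycle, forecasts_by_cycle[1:])]

# Hypothetical storm-total QPF (inches) at one point, successive GFS cycles:
qpf = [("00z", 0.8), ("06z", 1.1), ("12z", 1.3), ("18z", 1.2)]
print(run_to_run_trend(qpf))   # [('06z', 0.3), ('12z', 0.2), ('18z', -0.1)]
```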


The Canadians have been running 4DVAR in their global model for some time now.  I've said this before, and I'll say it a million times more: although 4DVAR has been the "gold standard" in assimilation for some time, it is not the be-all, end-all.  Furthermore, being able to run 4DVAR is NOT just a matter of computing, as there is a huge development and maintenance cost since you need to have available the linearized and transpose (adjoint) versions of the model.  There are also issues of scalability.  For these latter two reasons, centers such as NCEP, UKMO, and Env. Canada are moving toward other paradigms, namely EnVar (i.e., "4D" assimilation algorithms that do not require the linearized and adjoint models).

 

Seriously, the Euro has not been in a class by itself this whole cool season since Sandy. Granted, this is an imfby post. It has a perpetual south bias with southern-stream systems, and the GFS has done better than the EC with both backdoors over the past two weeks. Even with Sandy, people forget the Delmarva slams (what would Bloomberg have done with that?). That being said, there is no denying what has already been posted; it's just closer than people (paging Cliff Mass) think.

