Numerical weather prediction infrastructure



For as long as I have posted here, and previously at Eastern, I've advocated for met jobs on the periphery of operational meteorology (research, technology, remote sensing, academia, etc.) as a way to get into the field, or to enter it at a more advantageous salary. Over in the Mid-Atlantic forum, there is a (really great) "Met Class" thread where members are encouraged to ask basic meteorology questions and get easy-to-understand answers from qualified professional forecasters and meteorologists. One of the questions that came up had to do with the kind of computers used to process numerical models. Here's the link:

http://www.americanw...ost__p__1396286

I thought I might open up the discussion to a larger audience since there is sometimes some interest in the meteorological disciplines that straddle the met/tech fence.

There are some exciting things going on in the numerical weather prediction processing arena these days. Right now, NCEP models are processed on IBM Power 575 supercomputer systems with 70-teraflop (70 trillion floating-point operations per second) capacity. The primary (Stratus) and its backup (Cirrus) allow NWP calculations to run at least 4 times faster than on the predecessor systems, with the implicit goal of ingesting more data, thereby improving forecast accuracy and extending watch and warning lead times for severe weather. They went in a couple of years ago.
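Since this sort of thing keeps coming up in the Met Class thread, here's a toy back-of-envelope in Python of what numbers like "70 teraflops" and "4 times faster" can mean for a forecast cycle. The peak capacity and speedup come from this post; the sustained fraction and the per-cycle operation count are illustrative guesses on my part, not published NCEP figures.

# Toy back-of-envelope: what "70 TF peak, 4x faster" can mean for wall clock.
# Peak capacity and speedup are from the post above; the sustained fraction
# and per-cycle operation count are illustrative assumptions.

PEAK_FLOPS = 70e12          # 70 teraflops peak (Stratus, from the post)
SUSTAINED_FRACTION = 0.10   # assumed: real NWP codes often sustain ~10% of peak
SPEEDUP = 4.0               # new system vs. predecessor (from the post)

sustained = PEAK_FLOPS * SUSTAINED_FRACTION          # ~7e12 FLOP/s delivered

CYCLE_OPS = 2.5e16          # assumed total operations for one forecast cycle
hours_now = CYCLE_OPS / sustained / 3600
hours_before = hours_now * SPEEDUP

print(f"Cycle wall clock now: ~{hours_now:.1f} h")    # ~1.0 h
print(f"On the old system:    ~{hours_before:.1f} h") # ~4.0 h
# The spare hours are what get reinvested in ingesting more observations
# and running the assimilation and model at a higher cost per cycle.

The point of the toy numbers: a fixed forecast deadline means any speedup gets spent on doing more per cycle, not on finishing early.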

For the hardware geeks out there, here's a pic of Stratus up in Gaithersburg in all its sterile data center glory. Cirrus is out at a NASA facility in WV.

http://www.noaanews....-banks-noaa.jpg

Posters like OhLeary and dtk can probably shed more light on the in-situ infrastructure if they are so inclined.

Knowing that increasing processing power will help with ingest/ops/output, but also knowing that these systems are hugely expensive (Stratus cost $180m), NOAA has recently partnered with DOE for 10 million hours of computational time over the next 3 years on their beast of a machine, the massively parallel GAEA housed at ORNL. It's an 1106-teraflop (1.1-petaflop) aggregate Cray XT6/XE6 system. For the hardware weenies, here's the spec: 5,152 16-core AMD Opteron 6200-series processors (82,432 cores!) with 4 memory channels per processor. The system has 248 terabytes of memory and 3.6 petabytes of DASD with 10 GB/s I/O, 1 petabyte of which is fast 48 GB/s scratch space for intermediate calculations. NOAA expects the infrastructure to allow its researchers to develop and refine climate models, run finer-resolution models, and do so more quickly.

To put into perspective how quickly processing power is increasing relative to cost: the conglomerated GAEA cost $73m, less than half of Stratus/Cirrus only two years earlier (although that $180m was spread out over a 9-year deployment schedule). That said, the research model runs and the prototype high-res coupled models that the Earth Systems Modeling branch at GFDL will be building for the new hardware are research-only at this point and will not be used for ops. Instead, they will provide the framework for the next-generation modeling systems that will support operational meteorology in the near future.
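If you want to sanity-check the GAEA figures, the arithmetic is easy enough to script. This is just me replaying the numbers from this post in Python; the note about the allocation probably being denominated in core-hours is my assumption about how these allocations usually work, not anything from the NOAA/DOE announcement.

# Replaying the GAEA numbers quoted above (nothing here is an official
# NOAA/ORNL spec sheet; it's arithmetic on the figures in this post).

processors = 5152
cores_per_cpu = 16
total_cores = processors * cores_per_cpu
print(f"{total_cores:,} cores")          # 82,432 -- matches the figure above

# The 10 million hours over 3 years are presumably *core*-hours (my
# assumption). Compare against what the whole machine could deliver:
machine_core_hours_per_year = total_cores * 24 * 365
print(f"{machine_core_hours_per_year:.2e} core-hours/year machine-wide")  # ~7.2e8
# So the NOAA allocation works out to well under 1% of total capacity.

# Cost per peak teraflop, two years apart:
print(f"Stratus/Cirrus: ${180e6 / 70:,.0f} per TF")   # ~$2.6M per TF
print(f"GAEA:           ${73e6 / 1106:,.0f} per TF")  # ~$66k per TF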

The next-gen exascale (1,000-petaflop) computing platforms are in the works now. If you are wondering where the money in the reduced FY'13 budget is going, look no further than the $126m the Administration is spending on exascale computing starting this year (a rough sketch of what a jump that size buys follows the links below). The outlay is based on some of the scientific prospects and expected benefits detailed here:

http://www.nccs.gov/...20v3__final.pdf

And here:

http://www.exascale....e&printable=yes (see the "notable products of recent workshops" link for the strategic directions in weather, climatology, and earth systems on the proposed exascale platforms).
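As promised above, here's a rough scaling sketch of what the exascale jump could buy. The cube-law cost assumption (grid area times a CFL-limited timestep) and the 25 km starting point are my own simplifications for illustration, not anything pulled from the linked workshop reports.

# Rough sketch: how much grid refinement does ~1000 PF buy over GAEA's
# ~1.1 PF, if forecast cost grows with the cube of horizontal refinement
# (2x finer grid => 4x the columns and, via the CFL limit, 2x the steps)?
# Assumptions are mine, not the linked reports'.

gaea_pflops = 1.1
exascale_pflops = 1000.0

compute_ratio = exascale_pflops / gaea_pflops        # ~900x the FLOPS
refinement = compute_ratio ** (1.0 / 3.0)            # ~9.7x finer spacing
print(f"~{compute_ratio:.0f}x compute -> ~{refinement:.1f}x finer grids")

# Illustration: a ~25 km global model becomes a ~2-3 km one, the range
# where deep convection starts to be explicitly resolved rather than
# parameterized, which is much of why these machines matter for weather.
print(f"25 km -> ~{25 / refinement:.1f} km")         # ~2.6 km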

When I have time, I'll draft up a "status of the software stack" post similar to this hardware-focused one. The amount of research going into modeling and software development right now, by governments across the world, academia, and private industry, is absolutely staggering.
