AmericanWx WRF


This. Make sure to have a selective-variable interface with regional subsetting, though, perhaps with the option to download lower-res GRIB files in addition to the full ones (sites making 800x600 CONUS maps don't need much better than 10 km resolution). Point soundings could be generated with GrADS, though starting a new GrADS instance (necessary for every request of a dynamically generated sounding) on a 2 GHz web server takes a full second, which in web terms is a very long time to spend generating content per visit unless you have a multicore server. Drawing the sounding itself takes far less time than the startup, so soundings for pre-determined points (like stations) could be generated en masse at a much faster rate.
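Since the dominant cost is process startup rather than drawing, one way to pre-generate station soundings en masse is to emit a single GrADS script covering every station, so the interpreter launch is paid once rather than per request. A rough sketch (the station list, control file name, and output directory are all hypothetical, and the actual skew-T drawing helper is elided):

```python
# Sketch: batch all station soundings into one GrADS script so the ~1 s
# interpreter startup is paid once, not once per sounding.
STATIONS = {"KOKX": (40.87, -72.86), "KBUF": (42.94, -78.72), "KPIT": (40.53, -80.23)}

def build_grads_script(ctl_file, stations, out_dir="soundings"):
    """Emit one GrADS script that writes a sounding image per station."""
    lines = [f"'open {ctl_file}'"]
    for stid, (lat, lon) in sorted(stations.items()):
        lines.append(f"'set lat {lat}'")
        lines.append(f"'set lon {lon}'")
        # a skew-T plotting helper script would be invoked here;
        # printim then writes the finished image to disk
        lines.append(f"'printim {out_dir}/{stid}.png'")
    lines.append("'quit'")
    return "\n".join(lines)

script = build_grads_script("awxwrf.ctl", STATIONS)
```

The generated text would then be fed to a single GrADS process in batch mode, amortizing the startup cost across all stations.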

Make sure to include at least the entire rectangle from 22N to 53N, 129W to 64W. Depending on how the model works, you may want to expand this to 20N to 60N, 140W to 60W (to get some sampling of systems from important areas... the models can't forecast an already-existing system that they never saw to begin with).
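For anyone scripting against those bounds, a trivial containment check looks like this (a sketch; west longitudes are negative):

```python
# Suggested domain bounds: the core rectangle (22-53N, 129-64W) and the
# expanded one (20-60N, 140-60W). Longitudes west of Greenwich are negative.
CORE = (22.0, 53.0, -129.0, -64.0)       # lat_min, lat_max, lon_min, lon_max
EXPANDED = (20.0, 60.0, -140.0, -60.0)

def in_domain(lat, lon, box=CORE):
    """True if the point falls inside the given lat/lon rectangle."""
    lat_min, lat_max, lon_min, lon_max = box
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
```

For example, a point at 58N over the Gulf of Alaska fails the core box but passes the expanded one, which is exactly the kind of upstream sampling the expansion is for.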

Also, check out the list of variables I posted earlier in the thread (in addition to those already in the first post).

What dataset will you be using to initialize the model?

You can do point soundings in WG now with any model that WG ingests.


I'm sure the severe weather crew would appreciate LCL/LFC heights and 0-3 km CAPE added to the list.

Yes, yes, yes, especially to the last one.

Also, I think 700-500mb lapse rates would be most useful.
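For reference, a layer lapse rate is just the temperature drop across the layer divided by its depth; a quick sketch with hypothetical (but plausible) 700 and 500 mb values:

```python
def lapse_rate(t_lower_c, t_upper_c, z_lower_m, z_upper_m):
    """Layer lapse rate in degC per km (positive = temperature falls with height)."""
    return (t_lower_c - t_upper_c) / (z_upper_m - z_lower_m) * 1000.0

# e.g. 700 mb: -5 C at 3000 m; 500 mb: -20 C at 5570 m
lr = lapse_rate(-5.0, -20.0, 3000.0, 5570.0)   # roughly 5.8 C/km
```

Values approaching the dry adiabatic rate (9.8 C/km) in that layer are what the severe crowd would be scanning for.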

The last thing I would suggest, and I know this one could be hard, would be significant tornado parameter (the original one). I find this one to be useful in reinforcing ideas especially on cold-season tornado events.
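For anyone who wants to compute it from the model fields, the fixed-layer form from Thompson et al. (2003) goes roughly as below, as I remember it; the exact caps and thresholds are worth double-checking against SPC's published definition:

```python
def stp_fixed(mlcape, mllcl_m, srh_0_1km, bwd_6km_ms):
    """Fixed-layer significant tornado parameter (Thompson et al. 2003 form,
    from memory -- verify the caps against the SPC definition):
    STP = (MLCAPE/1500) * LCL term * (0-1 km SRH/150) * shear term."""
    lcl_term = (2000.0 - mllcl_m) / 1000.0
    lcl_term = min(max(lcl_term, 0.0), 1.0)        # 1 below 1000 m, 0 above 2000 m
    shear = min(bwd_6km_ms, 30.0)                  # 6 km bulk shear capped at 30 m/s
    shear_term = 0.0 if bwd_6km_ms < 12.5 else shear / 20.0
    return (mlcape / 1500.0) * lcl_term * (srh_0_1km / 150.0) * shear_term

stp_fixed(1500.0, 1000.0, 150.0, 20.0)  # every term is 1, so STP = 1.0
```

With MLCAPE, MLLCL, 0-1 km SRH, and 0-6 km bulk shear already on the output list, this is a cheap post-processing step.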


925 mb streamlines

Ability to view certain maps at regional levels with county lines if possible, perhaps rendered after the model is done running so as not to slow things down further.

I want to see an ECMWF-WRF :P

This....and especially that.


Does this mean the same dataset that is input into the current GFS, or the hour 6 forecast data from the previous GFS run?

It initializes from the most current 0-hour GFS analysis. I'd like to initialize off RUC and use GFS for boundary conditions, but I haven't had great results in some previous tests of it...though we'll see when I have the full WRF ready to test.


Sounds like everything is really coming together! Looking forward to this thing!


Showing my ignorance again: does the GFS initialization data set have lower data density than a regional, shorter-term model like the RUC, hence the desire to initialize off a different data set and set boundaries via the GFS?

BTW, IIRC from Eastern, the WRF can run more than a single physics package, no?

Yes

http://www.google.com/search?hl=en&source=hp&q=wrf+physics+options&aq=1&aqi=g2&aql=&oq=wrf+physi&gs_rfai=C1BSp72HxTOuDI57SyQT7ycCdAwAAAKoEBU_Q6Dud

Which will be used?


Yeah, with our horizontal resolution so small it would be preferable to initialize from a higher-res dataset, but I've still had really good luck with the GFS, and that's what the AmericanWx WRF will be starting out with; it can always be changed if needed.

You're correct that WRF can run different physics packages. Along those lines, it's a bit off base for people to refer to it as "the WRF," since there are so many combinations and options available that you can come up with a rather unique and useful model of your own that outdoes someone else's. It's the same dynamical solver, but it's hard to compare one WRF to another if you don't know how each person has set theirs up. The Admins will release the details of the final configuration when it's ready to go.
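To give a sense of what "a configuration" means here: WRF's physics choices live in the &physics section of namelist.input, one integer per scheme. The combination below is just one plausible setup for illustration (Thompson microphysics, YSU boundary layer, Noah land surface), not the AmericanWx WRF's actual configuration, which hasn't been released:

```fortran
&physics
 mp_physics         = 8,    ! Thompson microphysics
 ra_lw_physics      = 1,    ! RRTM longwave radiation
 ra_sw_physics      = 1,    ! Dudhia shortwave radiation
 cu_physics         = 1,    ! Kain-Fritsch cumulus (often disabled at convection-allowing grid spacing)
 bl_pbl_physics     = 1,    ! YSU planetary boundary layer
 sf_sfclay_physics  = 1,    ! MM5 similarity surface layer
 sf_surface_physics = 2,    ! Noah land surface model
/
```

Swapping any one of these integers gives a different "WRF," which is exactly why two WRF runs aren't comparable without knowing the namelist.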


Snow depth (different from snowfall)

3-hr snowfall

Instantaneous snow/water ratio (be sure to mask correctly)

1000mb height (for the 1000-500mb thickness)

2m temp (NOT surface... they are two totally different values)

2m 3hr max temp

2m 3hr min temp

10m wind (again, NOT surface)

surface wind gust

Cloud cover %

500mb height anomaly

1000mb height anomaly (again, this is for thickness)

Instantaneous precip rate

Convective precip 3-hr

Instantaneous convective precip rate

Instantaneous snowfall rate (multiply the ratio by the precip rate while masking by precip type)

Precip type (not just %, but differentiating between RN/ZR/IP/SN)

Any chance of making it hourly instead of 3-hourly? (The NAM is hourly, even though no one offers the hourly maps online yet, AFAIK.)

I like this list... and would add the 1000-850mb and 850-700mb thicknesses.
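A couple of the derived fields in that list are simple enough to sketch in a few lines; all numbers below are hypothetical grid-point values for illustration:

```python
def thickness(z_lower_m, z_upper_m):
    """Layer thickness in meters: upper geopotential height minus lower."""
    return z_upper_m - z_lower_m

def snowfall_rate(precip_rate_mmhr, snow_ratio, ptype):
    """Instantaneous snowfall rate, masked by precip type as suggested above:
    snow-to-liquid ratio times the liquid-equivalent rate, but only where
    the diagnosed type is snow."""
    return precip_rate_mmhr * snow_ratio if ptype == "SN" else 0.0

# hypothetical heights (m) at one grid point
z = {1000: 112.0, 850: 1457.0, 700: 3012.0, 500: 5580.0}
thk_1000_500 = thickness(z[1000], z[500])   # the classic rain/snow thickness
thk_850_700 = thickness(z[850], z[700])
```

The masking in `snowfall_rate` is the point of the "mask correctly" caveat: an unmasked ratio times rain rate would paint bogus snowfall into warm sectors.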


And this is exactly why we do partial cycling (based off the GFS) for the regional models...the large scale initialization is far superior to any regional model that you let cycle on itself.

Perhaps you could do a finer-scale analysis using the GFS analysis as the guess (using WRF-VAR, or the GSI). Then you'll have the advantage of the large scale but still be able to initialize the finer scales where the observations provide information. Feel free to PM me if you're interested, as I might be able to help out in getting a local run of the NCEP analysis going with your WRF model... or at the very least could throw some ideas at you.

Also, as an aside, the RUC will eventually be replaced with a WRF-based rapid refresh....this would presumably be useful for initializing as well...


Hey, welcome! You should get a color-coded modification to your user identity here.

Thanks - I'll look into that..

With the LAPS system we could use either the RUC, HRRR, or ECMWF as a first guess, then bring in additional in-situ and remotely sensed observations to initialize a higher resolution WRF with what we call the "hot-start".


I've been wondering about using LAPS but haven't looked very far into it yet. We're discussing setting it up at our WFO as well, so hopefully I'll be learning more about it.


Archived

This topic is now archived and is closed to further replies.
