Everything posted by MegaMike
-
Offer me some feedback, please! I ran WRF (+ some additional utilities) from 1978-02-04 to 1978-02-09 (for sh!ts and giggles) and created plots for two different grids (the parent domain, d01, and one nested domain, d02). I added plots of temperature, wind speed, wind gusts, visibility, precipitation type, etc... and superimposed observations when applicable. The still images were looped to create a video, and I added audio to "set the mood." Since I ran a simulation of a historical blizzard, I figured the board would find it interesting. Admittedly, my video editing skills are poor...
-
Relative humidity to absolute humidity formula
MegaMike replied to SG4891's topic in Meteorology 101
This methodology looks reasonable: https://carnotcycle.wordpress.com/2012/08/04/how-to-convert-relative-humidity-to-absolute-humidity/ You won't even need to convert your temperature (C) or relative humidity (%)... Just plug n' chug. To get kg/m3 from g/m3, just multiply g/m3 by 1kg/1000g == 1/1000.
-
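As a sketch of the plug n' chug above, here's the linked conversion in Python (the 20 C / 50% inputs are arbitrary example values):

```python
import math

def absolute_humidity(temp_c, rh_percent):
    """Absolute humidity (g/m^3) from temperature (C) and relative
    humidity (%), per the approximation in the linked carnotcycle post."""
    return (6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))
            * rh_percent * 2.1674) / (273.15 + temp_c)

ah_g = absolute_humidity(20.0, 50.0)  # roughly 8.6 g/m^3
ah_kg = ah_g / 1000.0                 # g/m^3 -> kg/m^3, i.e. multiply by 1/1000
```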
I agree! It's a very good question. Now that it's been a few years, I'd rank it like this: FORTRAN, NCAR Command Language (NCL; which is no longer being updated), and Python. I develop scripts pretty often in NCL and Python for statistical and graphical reasons. FORTRAN's important because most, if not all, numerical weather prediction programs are coded in FORTRAN. If anyone's curious, the code to run NWP (for the Weather Research and Forecasting model specifically) looks like this: https://github.com/wrf-model/WRF/tree/master/phys - select any .F file in that repository. For Python, the script will look like this: https://wrf-python.readthedocs.io/en/latest/plot.html (I mainly plot images with Python). NCL's pretty similar to this, but faster when working with .nc/.grb/etc... files. In all honesty, if you learn one language, you obtain a basic understanding of them all.
-
No. Public websites keep snowfall calculations simple because 1) scripting Cobb's/Dube's algorithm is too computationally expensive, 2) in the end, it doesn't matter which algorithm you utilize for forecasts, or 3) they can't write the script. Besides what I wrote above regarding positive snow depth, a 10:1 ratio is easier to compute:
a) If they use a precipitation type algorithm (csnowsfc is boolean wrt snow -> 1==snow, 0==not snow): snowfall = csnowsfc*LWE*10. For snowfall, precipitation type algorithms perform well... Diagnosing mixed precipitation types is problematic though. Here's an article that provides results using "observed" data: https://journals.ametsoc.org/view/journals/apme/55/8/jamc-d-16-0044.1.xm. Go to Table 1.
b) If they use microphysics scheme output (SR is continuous -> from 1==all frozen precipitation to 0==no frozen precipitation): snowfall = SR*LWE*10. You occasionally see websites that state, "this product may include sleet..." They likely used this methodology since SR is a function of what the microphysics scheme diagnoses (mass/concentration of rain, ice, snow, etc...). If it diagnoses graupel, sleet, and hail, SR will include graupel, sleet, and hail as well. It depends on the microphysics scheme. Most modeling systems use Thompson's microphysics schemes, which means rain, ice (sleet), snow, and graupel are included in SR.
Websites should be more transparent about their methodologies. Some are vague as He!!. Another pet peeve of mine: Pivotal provides Kuchera's algorithm for PAID members. Why? Nobody has published a paper on its accuracy, and I'm sure other SLR algorithms perform better.
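A minimal sketch of the two 10:1 computations above (the array values are made up; csnowsfc, SR, and LWE are the fields named in the post):

```python
import numpy as np

lwe = np.array([0.10, 0.25, 0.05])  # liquid water equivalent (inches), made-up values
csnowsfc = np.array([1, 1, 0])      # a) boolean p-type output: 1==snow, 0==not snow
sr = np.array([0.9, 1.0, 0.2])      # b) continuous frozen fraction from microphysics

snow_ptype = csnowsfc * lwe * 10.0  # method a): 10:1 wherever p-type says snow
snow_sr = sr * lwe * 10.0           # method b): 10:1 scaled by the frozen fraction
```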
-
That's a good question. The algorithm takes into account "gauge losses" due to melting, compaction, etc... caused by the land-air interface. If land isn't categorized by the model at a given grid point, 0/NaN values will be produced by the algorithm. That's why there are 0/NaN values along the coastline. The (very coarse) GFS categorizes those grid points/locations as "water." The website post-processes those maps from the GFS by snowfall(t) = abs(snow_depth(t) - snow_depth(t-1)) + snowfall(t-1), where t==time. snow_depth is calculated within the model though. For the coding/scripting weenies: https://github.com/wrf-model/WRF/blob/f311cd5e136631ebf3ebaa02b4b7be3816ed171f/phys/module_sf_noah_seaice.F [via WRF] - search for "SNOWH." I'm assuming the GFS uses the same subroutine. For the non coding/scripting weenies: https://www.wcc.nrcs.usda.gov/ftpref/wntsc/H&H/snow/AndersonSnow17.pdf - the "new snowfall" calculation is denoted by equations 2.a through 3... It's different from what most people are accustomed to on this site: it calculates snowfall by diagnosing snow density.
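The post-processing recurrence above, written out as a small function (a sketch; the depth series is a made-up example):

```python
def snowfall_from_depth(depth):
    """Accumulated snowfall from a snow-depth time series via the recurrence
    snowfall(t) = abs(snow_depth(t) - snow_depth(t-1)) + snowfall(t-1)."""
    snowfall = [0.0]
    for t in range(1, len(depth)):
        snowfall.append(abs(depth[t] - depth[t - 1]) + snowfall[-1])
    return snowfall

snowfall_from_depth([0.0, 2.0, 5.0, 4.0])  # -> [0.0, 2.0, 5.0, 6.0]
```

Note the abs(): a decrease in depth between output times (settling/melt) also increments the total, which is a quirk of this depth-difference approach.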
-
Thanks for the feedback, everyone! Today is NC + 2 for me. I'm still getting everything set up (furniture/internet), but I'm beginning to get a sense of the area. I live in a nice community surrounded by Massachusetts' equivalent of Route 1 on steroids (15/501). The only sketchy area I've seen so far is along 15/501 near Walmart; otherwise, not bad! Obviously, NC has a much different feel... Even the dirt looks different than in NE (more clay?).
-
I'm not too big into the NHL. I rank the major league sports as NFL>>>MLB~NHL>>NBA>MLS. The Pats are my favorite team, followed by the Red Sox, Bruins, then the Celtics. I mostly watch the B's and the C's during the playoffs, with the occasional regular season game. Fun fact: the first Pats game I remember is the AFC div. game vs. the Raiders. I have no memory of the Pats having a losing record... Super impressive!

Noted. I won't stick any of my appendages inside any dark, empty void.

I plan on moving back to MA after a year of work for the EPA. They want me to get the "EPA experience." After that, I can work remotely from anywhere I'd like. My time in NC will be ironic: I got my degree in atmospheric science because I love winter weather. My fondest memories involve snowstorms. To name one, during the blizzard of '05 (Jan 22-24), I remember seeing cars slowly becoming engulfed by snow w/near hurricane force wind. For nearly 12 hours, I remember pure whiteout conditions. When the event ended, I went outside and thought to myself, "I feel like I'm on Hoth." In some places, the snow was taller than I was. To this day, nothing compares... Oh! I was ~14 at the time, so I was probably 4'6" or something like that... So nostalgic! I created snowfall maps for fun. You can view them here (only for the NE US) if you'd like: https://github.com/msw17002/Historical-Precipitation-Analysis/tree/main/Northeast_Snowfall

Some new terminology here: palmetto bugs and noseeums. The worst I run into are horseflies. They're super annoying! Thankfully, I've never been bitten by one.

So Magpie meant casual waving! I thought he meant something different lol. Thanks for clarifying that!

I look forward to any winter or severe weather event! I'm severe weather deprived... I usually get nothing significant (severe) where I live now. The accents will be fun! I always like listening to different dialects. Oh, the anticipation!
-
This is great! My mother lived in SC for a few years, and her fear of the local wildlife was passed on to me. I exaggerated a little bit for a laugh, but there's sincerity in what I wrote. From what I read online (probably not reliable), spiders like the brown recluse and black widow are relatively common throughout NC. I presume this is either not entirely true or nothing to be worried about? I'm looking forward to experiencing the culture change! I imagine there's a pretty big difference b/n people from the NE and the SE in general. The only problem I foresee is that I don't barbecue and I don't watch college sports.
-
Ah! I didn't see your second comment in this thread. That's what I figured though. Getting quality controlled snowfall data is extremely difficult even in the U.S. For my snowfall evaluations, I use the GHCND archive to retrieve historical snowfall observations. You can get snowfall, snow depth, liquid water equivalent, etc... from the ftp/https server for all available stations which may include China. I'd try giving it a shot if you haven't already done so. It might help you add more data points onto your map. https://www.ncdc.noaa.gov/data-access/land-based-station-data/land-based-datasets/global-historical-climatology-network-ghcn
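If you pull the per-station .dly files from that server, they're fixed-width text; a minimal parser for one element (SNOW here) might look like this (column layout per the GHCN-Daily readme; -9999 marks missing days):

```python
def parse_dly_element(path, element="SNOW"):
    """Parse one element from a GHCN-Daily .dly fixed-width file into a
    {(year, month, day): value} dict (SNOW values are in mm)."""
    records = {}
    with open(path) as f:
        for line in f:
            if line[17:21] != element:   # element code sits in columns 18-21
                continue
            year, month = int(line[11:15]), int(line[15:17])
            for day in range(31):        # 31 slots of 8 chars: 5-char value + 3 flags
                value = int(line[21 + day * 8 : 26 + day * 8])
                if value != -9999:       # -9999 == missing
                    records[(year, month, day + 1)] = value
    return records
```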
-
Good to know! I'm mainly worried about spiders navigating their way into my apartment then biting my hand off (joking). That's my greatest fear. COVID might be on that list too since it'll be difficult to acclimate to an area that's partially shut down. I'll be working with ORAU as a private contractor for the EPA, so it's basically graduate school all over again. I have a 6 month lease at the Reserve at Patterson Place in Durham. Their fitness center sold me on the complex. Would you suggest getting anything like a dehumidifier?
-
Hey guys, I'm moving to Durham, NC in several days and I wanted to ask: how should I prepare myself? I've lived in New England my entire life (MA->VT->CT->MA) and I'm mainly worried about insects/snakes. To be clear, I'm not worried about the weather...
-
Active mid December with multiple event potential
MegaMike replied to Typhoon Tip's topic in New England
Pivotal doesn't do any weighting. Snowfall is readily available from the NBM itself. I wrote about it a while ago, but details on the NBM (which is an ensemble) can be viewed here: < https://www.meted.ucar.edu/winter/nbm32winter/ > You'll need to register first though. At this range, I'd recommend using it. It begins losing its advantages as the event approaches and a consensus develops.
-
Active mid December with multiple event potential
MegaMike replied to Typhoon Tip's topic in New England
Different plotting schemes... Pivotal defines its bins from -60 to +120 in increments of 1 (approximating here), so 180 bins in total. StormVista defines its bins from -40 to +120 in increments of 5 (32 bins in total). Since Pivotal plots its images with more bins, they appear finer than StormVista's. Assuming both sites are plotting the same model without interpolating, there should be no difference in the underlying data.
-
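The effect is easy to see if you map the same values into each site's bin edges (a toy sketch; the ranges/increments are the approximations from the post above):

```python
import numpy as np

temps = np.array([-12.3, 0.4, 31.7, 88.9])      # arbitrary sample values

fine_edges = np.arange(-60, 121, 1)             # Pivotal-style: 180 one-unit bins
coarse_edges = np.arange(-40, 121, 5)           # StormVista-style: 32 five-unit bins

fine_bins = np.digitize(temps, fine_edges)      # same data, many more color classes
coarse_bins = np.digitize(temps, coarse_edges)  # same data, chunkier color classes
```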
December 5-6, 2020 Storm Observations and Nowcast
MegaMike replied to Baroclinic Zone's topic in New England
Is that near the Frog Bridge? I lived on Jackson St. several months ago. Meanwhile, I'm in West Boylston, MA for this event. I'll be moving to NC soon so I wanted to savor this [possibly] final snow event before I leave New England. ~3/4 inches of snow on the ground w/medium sized flakes. It has been snowing since roughly 10(ish)AM. -
Good call (assuming it's a dam). I had to find the point manually, but it led me to the following metadata regarding that station:

GHCND_IDENTIFIER  LAT      LON       ELEVATION_m  STATE  STATION_NAME
USC00173588       45.4586  -69.8653  253.0        ME     HARRIS STN

If you're curious, that location observed 60 inches of snowfall for the denoted time range. The maximum for this event occurred at Pinkham Notch in New Hampshire. I might remove it from the map since it's pretty absurd, or just too wildly epic. The station observed 108 inches of snowfall, which is 38 more than any other station lol. It was an "outlier," but based on the location, I kept it.
-
Good to hear! I ran the interpolation for all events listed here < https://www.ncdc.noaa.gov/snow-and-ice/rsi/societal-impacts > for the NE US, so there are 211 in total. I'm iterating from highest to smallest RSI (the sum of area/population impacted by climatological snowfall thresholds). At the moment, I have 19930312_19930315, 19960106_19960109, 19780204_19780208, and 19690222_19690228. I'm hoping I'll get some feedback on the interpolations (such as additional observations or ideas... The 4 Seasons!!!). I fully expect to redo some of these images to make them more accurate for my ultimate analysis. I'm surprised how detailed the GHCND archive is though. There are plenty of observations since 1900 for our area. Northwest Maine was the only exception, so I had to be creative for that region. I could look at other historical events too. I just need the start and end dates (even dating back to 1900). Here are the final images I'll post until the other images process and I decide where to put them.
-
Great! Thanks, Will! dendrite, you can PM me.
-
Thanks! I tried to superimpose the observations onto the map, but it caused too much clutter. It was impossible to view numeric observations clearly. That's why I chose to use a color scale for the symbology instead. I can add the text, but I'd have to focus on smaller territories, as you mentioned. Is there a place I can store these plots somewhere in the forum (maybe a new thread)? I have 208 other plots I'd like to upload once the script finishes.
-
Hey guys, I wanted to ask: would you change the formatting of the attached snowfall map? If there's anything that you'd add, remove, or edit, please let me know. I'm prepared to batch over 200 other snowfall interpolations (1900-now) that I created using GHCND/PNS observations and Python's PyKrige library (https://pypi.org/project/PyKrige/). I'd like to keep a gridded historical archive for other purposes as well. For example, I'd like to 1) create plots for the number of occurrences of 4+, 10+, etc... events (on a 0.025x0.025 degree grid) and 2) determine the maximum snowfall amount and index the event for all locations (on a 0.025x0.025 degree grid).

This isn't work that I plan on publishing since there are other agencies with similar and likely more accurate plots (NWS offices, https://www.nohrsc.noaa.gov/snowfall_v2/, and https://www.ncdc.noaa.gov/snow-and-ice/rsi/nesis). I created these interpolations to conduct an analysis on a constant grid. I'll admit, I did it for fun too! I may also create a large poster of all plots altogether.

Anyways, I decided not to plot the legend. The plot's elements are:
- shaded circles: GHCND/PNS observations (I added PNS observations when/where applicable)
- shaded crosses: GHCND/PNS observations that were determined to be spatial outliers... I found outliers using a methodology similar to this one < https://pro.arcgis.com/en/pro-app/tool-reference/spatial-statistics/h-how-cluster-and-outlier-analysis-anselin-local-m.htm > with a 50 km search radius, IDW weights to the 2nd power, and z-scores of -1.280 and 4 (assuming a normal distribution)
- shading: kriging interpolation onto a 0.025x0.025 degree grid
- light blue shading: lakes (from https://scitools.org.uk/cartopy/docs/v0.14/matplotlib/feature_interface.html)
- dark blue shading: the Atlantic (from https://scitools.org.uk/cartopy/docs/v0.14/matplotlib/feature_interface.html)
- gray segments: interstates

I also added shapefiles for Canada's provinces, the USA's states, and the USA's counties.
Ideally, the observations should match the color of the shaded interpolation.
-
Nobody gives the NBM the love it deserves! Compared to the 10:1/Kuchera SLR algorithms, the NBM (conceptually) is far superior (especially compared to deterministic output). For details, visit this link: https://www.meted.ucar.edu/winter/nbm32winter/. You'll need to create a free account to access the module, but it's worth it! Essentially, SLRs are computed from deterministic/ensemble output via multiple SLR algorithms weighted differently by modeling system(s)... The Cobb SLR algorithm IS incorporated into the NBM's snowfall computation (along with 3 other algorithms). I still can't find a public website that provides graphics for the NBM, so I'll attach today's 00z, 06z, 12z, and 18z snowfall accumulation plots here. It's straightforward to get and process:
1) Access the 'http://nomads.ncep.noaa.gov' server utilizing a URL to 'request' output from a specific model run. As an example, http://nomads.ncep.noaa.gov:80/dods/blend/blend20201029/blend_1hr_00z will obtain the NBM's 2020-10-29 00z simulation (1hr time-steps).
2) You'll get all of the NBM's variables for all times on a lon/lat grid once you open the dataset (the dods endpoint serves OPeNDAP)... Extract the accumulated snowfall variable called 'asnowsfc' (for the final available hour), then convert mm->inches.
3) Plot the data using optional mapping resources. It takes a while to make it look pretty, but overall, it's not too difficult.
If anyone wants to view the script, PM me! I'm uploading my graduate school script to GitHub. Anyone can access/use it if they'd like.
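The access-and-extract steps above can be sketched with xarray, which opens an OPeNDAP URL directly (a sketch; assumes a netCDF4 or pydap backend is installed, and 'asnowsfc' is the variable named in the post):

```python
import xarray as xr

def nbm_final_snowfall(url):
    """Open an NBM run (OPeNDAP URL or local file) and return accumulated
    snowfall in inches at the final available hour."""
    ds = xr.open_dataset(url)               # e.g. .../dods/blend/blendYYYYMMDD/blend_1hr_00z
    snow_mm = ds["asnowsfc"].isel(time=-1)  # accumulated snowfall, last time step
    return snow_mm / 25.4                   # mm -> inches
```

Plotting (step 3) is then just matplotlib/cartopy on the returned 2-D field.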
-
MET courses are basically applied mathematics/physics courses. If you cannot understand mathematical and/or physics concepts, you'll likely struggle in all MET related courses. Even though you didn't ask, I'd recommend taking calculus (and/or differential equations) at a community college before committing to a MET program (if you're not feeling confident). This will help you gauge the difficulty of the curriculum. In my opinion, most people can succeed in all MET, mathematics, and physics courses as long as they work hard and do not quit.

Q. "What else should I expect assuming I'm able to at least have a basic comprehension of the calculus classes?"
A. Expect mathematical terminology within most MET equations. For example (QG theory), know what a partial derivative/del operator is and how to apply these terms within a finite area/volume < https://www.meted.ucar.edu/labs/synoptic/qgoe_sample/index.htm#:~:text=The Quasi-geostrophic (QG),analyses at different pressure levels >

Q. "Are the calculus equations the most difficult part of MET courses or are there other stuff that could pose an even greater challenge for a student?"
A. No... Differential equations are the most "difficult" part of MET courses. They don't get too complicated though, so I wouldn't worry about it. Just focus on the mathematics curriculum.

Q. "Are the physics courses just as difficult?"
A. Absolutely not. Not for me anyway... Again, just focus on the mathematics curriculum.

Q. "For a long term goal, I'd be more interested in Broadcast Meteorology, does a bachelor's degree suffice or is grad school pretty much expected for the career?"
A. I know a lot of colleagues who are broadcasting (as meteorologists) at the moment. Three of them declared ATM as a minor... They're all doing well. You can always get a bachelor's degree in television studies (broadcast concentration) and a minor in ATM.
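If the terminology is the sticking point, a computer algebra system is a cheap way to build intuition for it; e.g., applying partial derivatives and the del operator to a toy field with sympy (the field here is arbitrary, chosen only for illustration):

```python
import sympy as sp

x, y = sp.symbols("x y")
T = sp.sin(x) * sp.cos(y)                # toy 2-D "temperature" field

dT_dx = sp.diff(T, x)                    # partial derivative: cos(x)*cos(y)
grad_T = (sp.diff(T, x), sp.diff(T, y))  # del operator applied to T (2-D gradient)
```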
-
Summer 2020 Banter and random observations
MegaMike replied to Baroclinic Zone's topic in New England
Yup! That sounds like you already have it installed. You can install/configure certain libraries and modules for Jupyter Notebook using conda or pip (from the Anaconda prompt or a terminal). I faced similar issues when I first started scripting. Basemap took a lot of time to install/configure... Overall, it took me ~1 year of constant scripting to understand Python pretty well. I started with MATLAB, but decided to move to Python since the script is compatible w/Linux (plus all of the utilities!). I'm sure you can find a Python class as it relates to meteorology; however, I learned Python best by writing scripts that I was interested in. The first script I created 1) obtained GHCN-D snowfall observations, then 2) identified suspicious observations using a spatial outlier algorithm. The intent was to use it on observations provided by members of this forum so I can label weenies w/statistical support
-
Summer 2020 Banter and random observations
MegaMike replied to Baroclinic Zone's topic in New England
Anything is possible with Python! That is what my script does (.csv output), but I only consider liquid water equivalent, snow water equivalent, and snowfall. The good thing about Python is that there are utilities/modules/etc... that already exist that you can install then run pretty easily for specific datasets. For example, for GHCN: https://github.com/aaronpenne/get_noaa_ghcn_data... Avoid any utility or archive that requires you to pay. This data is public. I'd suggest installing Jupyter Notebook. https://jupyter.org/ It's basically a (somewhat) interactive interface that runs Python script. It comes with a lot of pre-compiled modules and libraries too. You can message me if you'd like more details or run into any problems. -
Summer 2020 Banter and random observations
MegaMike replied to Baroclinic Zone's topic in New England
If you're looking for max/min's (temperature, wind speed, wind gust, etc...) and or precipitation, GHCN-D is great. I'm not sure if there's an interactive website that hosts this data though... I use Python script to extract daily observations by station and date. https://www.ncdc.noaa.gov/data-access/land-based-station-data/land-based-datasets/global-historical-climatology-network-ghcn