Typhoon Tip

Meteorologist
  1. Like... some in here will lose it if any kind of weather happens? Not sure what was so distracting, warm or cold, about any model run this morning, for that matter.
  2. Weird warm sector ... PHL-NWR and all points surrounding and in between have DPs between 43 and 50 F... whilst, clearly by the other obs, a warm front has soared N ... One might've thunk a DP of 60 in the warm sector, but that's a dry warm dose punching up the coast. In fact, said warm boundary is nearing or just passing through HFD as you're reading this. Winds bump S and go gusty SW across CT, and the sat loops have day-glow skies if not sun splashing there. Moving rapidly NE. We're likely to get that even up to Rt 2 and SE NH soon... We won't make 70, but I could see it surging into the mid 60s.
  3. Yeah... I guess what I was in part dancing around is whether or not we should expect general product outages. We may go through a "bug" period where modeling gets blackout times. Prooobably why they chose August? Since that's the most quiescent time of the year, if you're going to do a huge disruptive product overhaul, down to the software level - which I also don't have a lot of faith will be properly hardship-estimated... - it makes sense to do it during the dog-day doldrums. I've been in several different software capacities over the last 25 years... I can tell you, when the issue is code-base level, the time estimates almost always run 2 to 5 times longer in reality than planned. Any other software engineers in here will know what I'm talking about.
  4. You know, it just occurred to me... Think how much work is going to have to go into the Internet technology bases, pan-systemically, for this upcoming August change. All that industry reliance - start there, and start thinking about 'turning off' all this guidance... I wonder... hm. One way to do it is to leave the old product suite as is, giving the source providership time and a chance to catch up. Running in parallel might be expensive, but it provides a chance for deep, deep product integration to be root-canalled. NAM/MOS web pages... grib filing and FTP automations... companies that may rely on those automations... hard to know where to begin. Graphics engines? My god. Massive, massive overhaul in the wholesale industry. The other way to handle it ( if smart ) is to induct Claude or Gemini or CODEX AI... like real fast, and start drafting up whole new web architectures - head start it. Expedience being the objective. Because in tech parlance, August is ten minutes from May. When the existing product suite took the last 20+ years of human engineering to create, with all its deeply rooted sourcing and product reliances... et al - yikes. The whole system can't really just be stopped on a dime because NOAA clicks a mouse to fire up these new tools. I mean, what am I missing...? As an afterthought, you wonder if the vendors may have already been working in the background with NCEP - perhaps developing against a beta system...
  5. You could almost predict that the separate layer of interpretive modeling wouldn't even be necessary. A lot of the cause behind their creation was/is to correct both known error and the anticipation and suspicion attached to each model run. But therein is the source of the human error... right? In a future where QC modeling cores are like... terrifyingly accurate out to 17-some-odd days, the interpretations become just a veneer technology layer at the output end of each cycle - more for readability. Yeah... that would, soup to nuts, be a bad day for weather forecasting as a profession, wouldn't it?
  6. Yeah, but those are interpretive tools, after the fact. Ha! Interpretive tools like that compendium you gave there are created by humans. You know what that means... it means we're likely to introduce a whole new layer of potential error. LOL Drool humor aside... one might suspect general error improves in step with raw data source improvement, because obviously the interpretive algorithms are then tapping into less error to begin with... etc.
  7. Not that anyone asked, but imho this sort of evolution was both inevitable and quite possibly necessary to keep up. It kind of goes along with arguing against those with AI paranoia ( in general... ), posturing all these dystopian fears over a future guided by x-y-z plausible consequence... There may be value in those concerns, but unfortunately, where was all this improving computing power over the generations going to go in the first place? The advent in technological history of Artificial Intelligence, at least to the scale of an Asimovian simulacrum, was always going to result - and it's not impossible that actual self-awareness ( currently dubbed 'the singularity' in popular culture ) will soon enough arrive. Digression there, but in the same sense an evolution of these modeling systems is just as inevitable. Beyond that, these rapid-update species make more sense to have in place with the inevitability of more ubiquitous Quantum Computing "coming on line". That latter aspect is going to be the biggest game changer in forecast modeling since Fourier transform theory was integrated with Navier-Stokes fluid mechanics - the fundamental principle of how wave mechanics propagate through the atmospheric medium... caressed by thermodynamics and 3-D integral vector calculus, and parameterized by the Coriolis parameter... One can perhaps intuitively see how/why this gets exponentially blown up by chaos out in time: all those tiny wave functions, both initial and emergent along the processing way, feed back and cripple the outlook by corruption. Quantum Computing offers a vastly improved ability at predicting where and, most importantly, whence those are fabricated; it can also tell something about whether they are fictitious or real. Keep the real; toss the sci-fi.
What remains is... a panacea for random contamination if you will, a fix for what the original modeler forefathers warned: 'because we can't assess the actual quantum states of every particle in space and time, the chaos that results means no weather model will be sufficiently accurate beyond a certain number of days.' In the years since, that limit has been extended considerably just by coming up with more physically proven mechanics, combined with denser empirical input data ( initialization of grids and so forth ). A lot of error production during processing comes from interpolation to "fill" gaps in data sparseness... etc., because interpolation is estimating. In other words, probability guess-work. Thus, input density and increased conventional computing power did/does extend the range outward by reducing the uncertainty cost. But there's ultimately still a theoretical limit; if we are not sampling at the discrete-est levels, and then computing in that same scalar space, we are about as far as we can go using conventional tech. With ever-improving sensory technology combined with arriving Quantum Computing... that sounds quite a bit like we're closing the gap on quantum blindness. Sorry... just some op-ed'ing.
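For what it's worth, the chaos point above is easy to toy-demo. The sketch below is illustrative only - it uses the Lorenz-63 system, the textbook stand-in for atmospheric sensitivity to initial conditions, not any NCEP core - and runs two trajectories whose initial states differ by one part in 10^8, then watches the separation blow up:

```python
# Illustrative only: Lorenz-63, the classic stand-in for atmospheric
# sensitivity to initial conditions (not any operational model core).

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance one forward-Euler step of the Lorenz-63 equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def run(state, steps):
    """Integrate `steps` Euler steps from `state`."""
    for _ in range(steps):
        state = lorenz_step(state)
    return state

def separation(steps, eps=1e-8):
    """Distance between two runs whose initial x differs by `eps`."""
    a = run((1.0, 1.0, 1.0), steps)
    b = run((1.0 + eps, 1.0, 1.0), steps)
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

for steps in (1000, 3000, 6000):   # think of these as "forecast hours", loosely
    print(steps, separation(steps))
```

Early on the two runs are indistinguishable; run it long enough and the separation grows to the order of the attractor itself. Denser input and more compute keep deferring that tax, but under conventional tech they never repeal it.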
  8. Meh... I know you're joking with him, but... that mid-70s bias on spring warmth is ( more likely... ) the media's fault. The problem is that even though CC should be loud in the media, it's loud for the wrong reasons: information for dollars. I mused years ago... the moment in history humanity figured out how to turn TV channel changes, mouse clicks, and thumb swipes into pennies, we were doomed. They only saturate the J.Q. less-informed, open-to-suggestion masses ( which, let's face it... probably 90%+ of all brain boxes alive ) whenever there is a record-breaking this, that, and the other thing. Discussing how anything works? Isn't as profitable. So... J.Q. comes away whitewashed that everything's supposed to be hot. Sometimes... CC causes a cold winter, right? Good luck with that - Seems like the government is attempting the same psy-op now with UAPs, though holy bejeezus, why?! They're now releasing classified material related to plausible extraterrestrials. First of all, zomg! But why now? If this has been going on since the 1940s... what is it about 2020+ that requires all this hubbub? Well... I guess the way it's gone, there's been some "deep state" covering up of reverse-engineering of 'you know what'. There's now purported covert anti-gravity propulsion, along with other exotic techs. Like zero-point energy - basically tapping the 10^19 eV background universal energy constant... blah blah. If that came out and was made known/ubiquitous? Game over. No hierarchical "echelon" form of human social order. That includes power. That includes wealth. That includes being tethered to the societal toil chain as dollar-valued slave labor. It would pretty much solve all energy needs, shorting demand down to the individual level. Doesn't take a massive descriptive essay to see where that intuitively goes. For starters, the economic engine that keeps humanity on the same page seizes like it ran out of oil and all at once stopped.
No pun. That's just one facet. Wildly digressing here... but if there's somehow, some unlikely way, even half a kernel's truth in all that, it's the greatest aspect in human history since "Groont" picked up a stick with one end on fire and saw a future. So how and why are these classified things being ordered released? Seems with so much at stake... so much absolute disruptive power, it's a major wtf moment in history. It already is, even if all that shit isn't true, because the goddamn highest order of government is doing the exposing. Whaaa. Could be a diversion, mm. Not a huge leap, I guess, with this cast of White House characters. Any way out of culpability in the Epstein files and a failing war: fluff UFOs ( refashioned as UAPs ). Maybe.
  9. Probably every summer in about 22 years ... Just another threshold moving on past ...
  10. I can see/sense Brian's complaint on that. A cursory scan of the higher-res visible satellite eyeballs roughly 50 to 80% sun vs 20 to 50% cloud obstruction, yet you seem to be majority shaded over your neighborhood.
  11. All the runs seem light on Mon/Tue. The GFS is spuriously cooling the 850 mb between 15 and 21Z on Monday, between the Presidential Range and D.E.M., extending barely +10C down into interior SNE. There is no cause that can be synoptically identified for doing that. Meanwhile, the CMC has even lower ceiling-level RH values, with no means to cool the 850 mb... but it opts to keep the BL less mixed and ( apparently ) doesn't think the sun's strong enough to make the difference there - which is suss, because if it's right about a hotter Sunday, the night's going to run elevated some... setting up a higher Monday launch. So not likely on that - I didn't venture into the Euro... I just figured its upper 70s under general 566 to 570 dm hydrostats must be the conspiracy to hide CC from showing up in the model outputs. haha.
  12. I thought it interesting ( albeit likely esoteric ) that the CMC bursts into the mid 80s on Sunday. It has 85 at 18z from BDL to Lawrence MA. I was looking at the 850s on all three and thinking that, given lower ceiling RH/sigma-level values ( meaning more sun ), WNW wind and near-solar-max power lasing the countryside, those 73-75s looked a little cool in the Euro and GFS. Rarely are we faced with a circumstance where a D5 looks more reasonable in a CMC run. heh. I don't know about 85s, but I could see the Euro/GFS 2m's being a bit light there. The difference is the CMC is warmer at 850... it seems to be trying to commit to the early-week continental delivery scenario earlier. So yeah, the others may be a couple warmer, and if the CMC ends up right... more substantially so.
  13. I was just coming in here to wax on the virtues of that synoptic look for Saturday... opened the app, and this was the first sentence I saw ^ Yeah... sensory overload. Top-3 day... perhaps taking first place - though it's early in the race. It feels like a trophy though; hard to top 74, light wind, lilac-aroma sun-gasm weather. That's on the backside of the weak nor'easter/closed-but-still-progressive circulation thing later Thur into Fri ( I know you're aware of all that... just sayn' ). It's a sign of the times, signifying having finally escaped the cold-breeze hangover pattern. It seems the oscillation behavior that comes with it will be mild-to-warm-to-mild-to-warm, as opposed to cool-to-mild... with one or two CC-exaggerated days mixed in.
  14. The whole 2m philosophy needs a make-over. It's abysmal. That over-mixing thing is a separate error from the under-sold high temps that happen in actual heat waves. It's like errors in both directions are true. Weird. It seems... beyond the late short range the model mixing is over-proficient... but as it comes into shorter vision, the model corrects - and then it just assigns the sfc sigma as the 1000 mb level and calls that the 2m. There's a new error anyway, world over, having to do with this new phenomenon of 'synergistic heat waves' - they exceed everything. We've talked about this in the past... There have been several, from the Iberian Peninsula as far N as London. Australia. Siberia... the steppe country near the Urals over the hills into Moscow. The Pacific NW... "sort of" 2012 in the lower OV - it tried to get in here but was cut off by a corrective derecho that pretty much undermined the ridge for everyone. These are a different thing altogether, where nothing gets them right, because they are literally synergistically created - like emergent properties where the temp just runs away beyond all conventional means of forecasting, machine or man. I don't think there were very many this last year, but it was a reasonably well-coupled cool ENSO mode, so maybe that has some capping aspect. But oh gee, guess what... we're heading into super dong.
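On the 'assigns the sfc sigma as the 1000 mb level' gripe - here's a hypothetical sketch of what a less lazy 2m diagnostic looks like. Everything in it ( the log-profile blend, the `mixing` knob, the 0.1 m roughness length ) is an illustrative assumption, not any operational post-processor's actual scheme: the 2m value is interpolated between the skin temperature and the lowest model level instead of just copying the lowest level down.

```python
import math

def diag_t2m(t_skin, t_lowest, z_lowest=30.0, z_ref=2.0, z0=0.1, mixing=1.0):
    """
    Toy 2 m temperature diagnostic (hypothetical scheme).

    Blends skin temperature and the lowest-model-level temperature
    with a log-height weight. `mixing` in (0, 1] scales how strongly
    2 m couples to the lowest level: ~1 for a well-mixed BL, small
    for a decoupled/stable BL hugging the skin temp. Heights in m.
    """
    w = mixing * math.log(z_ref / z0) / math.log(z_lowest / z0)
    return t_skin + w * (t_lowest - t_skin)

# Hot, sunny afternoon: skin at 35 C, lowest model level at 28 C.
print(diag_t2m(35.0, 28.0))               # blend lands between the two
print(diag_t2m(35.0, 28.0, mixing=0.3))   # weak mixing -> closer to the skin
```

The lazy approach pins the 2m at 28; the blend lands it in the low 30s for the well-mixed case, which is at least the right direction for the under-sold heat-wave highs.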
  15. Hopefully we can lay down 2 to 3" of basin coverage from that nor'easter on Thursday ...