
January Banter 2025


George BM

Recommended Posts

3 minutes ago, pazzo83 said:

I'd actually be curious what kind of hardware these models run on. Do they require a lot of GPUs like machine learning models / LLMs, or just a ton of CPUs for parallelization?

https://www.ecmwf.int/en/computing/our-facilities/supercomputer-facility

Basically a big Unix environment using a supercomputer. I get to do the same on the daily, some cool equipment.


Dale City, Virginia has a nice dry cold airmass. Your high temperature was 30 degrees. It is now 26 degrees with a dewpoint of 10 degrees above zero. Winds are northerly at 5 to 13 mph. You have a massive winter storm incoming, with good amounts of pow on the way. Ground is cold. Surface Frigidization will be in play. This means roads and sidewalks WILL CAVE, from Flake 1. Enjoy, everyone! I will be watching live cams in your area, cheering your storm on!!!!!!


9 minutes ago, biodhokie said:

https://www.ecmwf.int/en/computing/our-facilities/supercomputer-facility

Basically a big Unix environment using a supercomputer. I get to do the same on the daily, some cool equipment.

Interesting - basically a ton of cores for parallel execution. All the massive LLMs are trained on huge clusters with thousands of GPUs because the core calculations involve gigantic matrices, which GPUs are really efficient at.
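To illustrate the point about matrices (an editorial sketch, not from the thread): each output row of a matrix product depends only on one input row, so the work splits cleanly across many independent workers - which is exactly the shape of computation GPU cores handle in bulk.

```python
# Sketch: why matrix multiplication parallelizes so well.
# Each output row of C = A @ B depends only on one row of A, so every row
# can be computed independently - on a GPU, thousands of cores each take a slice.

def matmul_row(a_row, B):
    """Compute one output row of A @ B; independent of every other row."""
    cols = len(B[0])
    return [sum(a_row[k] * B[k][j] for k in range(len(B))) for j in range(cols)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

# Each call below could run on a different core/worker with no coordination.
C = [matmul_row(row, B) for row in A]
print(C)  # [[19, 22], [43, 50]]
```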


27 minutes ago, Jebman said:

It's a HECS.

Hey does anyone know where I can find good live streaming webcams in north Virginia where I can get my snow fix tomorrow night into Monday?

Traffic cams, Jeb. By the way, I had an amazing Jebwalk yesterday evening in a pounding squall with 30 MPH winds. It was invigorating.


14 minutes ago, biodhokie said:
5 minutes ago, pazzo83 said:

Interesting - basically a ton of cores for parallel execution. All the massive LLMs are trained on huge clusters with thousands of GPUs because the core calculations involve gigantic matrices, which GPUs are really efficient at.

Pretty much; it depends on how the person submitting to the cluster sets up the jobs.

 

They're using the Slurm scheduler (which is what we use as well), which takes a single job where every line of the command file becomes a subjob. Effectively you can create a parallelized pipeline that produces a single cohesive output.

 

What has me curious is what their pipelines look like and how they're coded, mainly to see how I could apply it to my own work, given that they constantly run the same pipeline to produce the model results.
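A rough local analogue of the "every line of the command file becomes a subjob" idea, sketched in Python (an editorial illustration: Slurm itself would distribute these lines across cluster nodes, e.g. via job arrays; the file contents and commands here are hypothetical):

```python
# Hedged sketch of the pattern described above: one "command file" whose lines
# each become an independent subjob. Slurm would schedule these across nodes;
# this local stand-in just runs each line as a subprocess in parallel.
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical contents of a command file - one subjob per line.
command_lines = [
    "echo preprocess",
    "echo run-model",
    "echo postprocess",
]

def run_subjob(cmd):
    # Each line is an independent subjob; capture and return its output.
    return subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout.strip()

# Run all subjobs concurrently; map() preserves the original line order,
# so the pieces can be stitched into a single cohesive output afterwards.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_subjob, command_lines))

print(results)  # ['preprocess', 'run-model', 'postprocess']
```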


I'm sure the Lamar critics in the media (who didn't actually watch the game) will look at the 50% completion percentage and say Lamar was awful and should be eliminated from MVP consideration. This will 100% be the take from Chris Russo, who loves to hate on Lamar and thinks Allen should win, just because. The Ravens beat the Bills 35-10, and Lamar played well while Allen was terrible.


30 minutes ago, biodhokie said:

Pretty much; it depends on how the person submitting to the cluster sets up the jobs.

 

They're using the Slurm scheduler (which is what we use as well), which takes a single job where every line of the command file becomes a subjob. Effectively you can create a parallelized pipeline that produces a single cohesive output. What has me curious is what their pipelines look like and how they're coded, mainly to see how I could apply it to my own work, given that they constantly run the same pipeline to produce the model results.

We had something similar when I was working for a couple of larger investment banks - the library we used to submit jobs basically tried to mirror Python's multiprocessing library, so it was fairly simple (and probably a fraction of the compute you are dealing with).
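The pattern described here - a cluster submission library that mirrors the stdlib multiprocessing API so local and cluster execution look the same to the caller - can be sketched with the standard library alone (the cluster backend itself is hypothetical; only the local path below is real):

```python
# Sketch of an API-mirroring pattern: if a cluster library exposes the same
# .map() interface as multiprocessing.Pool, swapping local <-> cluster
# execution needs no changes to the calling code.
from multiprocessing import Pool

def simulate(seed):
    # Stand-in for an expensive per-job computation.
    return seed * seed

if __name__ == "__main__":
    # Local path: genuine stdlib multiprocessing.
    with Pool(processes=4) as pool:
        results = pool.map(simulate, range(5))
    # A cluster library mirroring this API would expose the same .map() call
    # but submit each simulate(seed) invocation as a separate cluster job.
    print(results)  # [0, 1, 4, 9, 16]
```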

