NVIDIA Made a CPU.. I’m Holding It.

Try Pulseway FREE today, and make IT monitoring simple at:

I'm at the Gigabyte booth at Computex 2023 where they're showing off bonkers new hardware from Nvidia!

Discuss on the forum:

Immersion tank 1 A1P0-EB0 (rev. 100):
Immersion tank 2 A1O3-CC0 (rev. 100):
Big AI server (H100) – G593-SD0 (rev. AAX1):

► GET MERCH:
► LTX 2023 TICKETS AVAILABLE NOW:
► GET EXCLUSIVE CONTENT ON FLOATPLANE:
► SPONSORS, AFFILIATES, AND PARTNERS:
► EQUIPMENT WE USE TO FILM LTT:
► OUR WAN PODCAST GEAR:

FOLLOW US
—————————————————  
Twitter:
Facebook:
Instagram:
TikTok:
Twitch:

MUSIC CREDIT
—————————————————
Intro: Laszlo – Supernova
Video Link:
iTunes Download Link:
Artist Link:

Outro: Approaching Nirvana – Sugar High
Video Link:
Listen on Spotify:
Artist Link:

Intro animation by MBarek Abdelwassaa
Monitor And Keyboard by vadimmihalkevich / CC BY 4.0  
Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0
Mouse Gamer free Model By Oscar Creativo / CC BY 4.0

CHAPTERS
—————————————————
0:00 Intro
0:22 Meet the Grace Super Chip!
1:22 We got permission for this…
3:13 …but not for this.
4:40 Now for the GPU!
6:13 That's where the Interconnect comes in
7:32 There's "old-fashioned GPUs" too
8:35 Crazy network card
11:00 Outro

69 Comments

Henry Zeigler

A green CPU with a blue GPU may soon be possible.
Scary times.

    Justin Biggs

    Scary times with pricing and greed, but interesting times hardware/software technology-wise

    Stephen Kennedy

    What the hell kind of bizarro world are we in?

    bob jay

    Highly doubt the green goblin is interested in making a cpu for peasants like us.

    BudgetGamingEnthusiast

    It already is

    The world is ending

macias

I love how Linus just HAS to disassemble everything he gets his hands on

    superintendent

    that's how he rolls

    GrumpyRatt

    Someone somewhere was holding their breath saying don’t f’ing drop it Linus don’t you dare drop it 😂

    Dilbertron

    that's how he rick rolls

    Jordi

    He could not use the LTT screwdriver though! What a missed opportunity!

    Ander01SE

    Imagine if it was GN Steve…

Krolitian

I love the idea of Linus just going into conventions and just unscrewing random tech he finds all over the walls without permission.

    faceboy

    sinus lebastion is just too dangerous at conventions

    PBR JOEYBOY99

    I was about to comment this myself, goes to show how much the companies trust him now

    G

    😂😂😂 damn that’s funny 🙂

    R

    seems like something he does everywhere he goes

    Vysair

    That’s probably what he did in the old days: on-the-spot permission with no prior planning

ANeM

For reference on the name: Grace Hopper was the US Navy computer scientist who wrote some of the earliest theory on machine-independent programming languages and is credited for writing the first compiler, two incredibly important steps towards modern computing.

    Shaun Streeter

    yes, hearing ‘grandma COBOL’ mentioned did bring a smile to my face

    Crimson-fox Twitch

    yeah, NVIDIA names a lot of their architectures after important people in science history.

    Markus

    also ranked Rear Admiral on top of that

    legerdemain

    Grace Hopper has a Posse.

    Cruz Immel

    She’s also often credited with coining the term ‘bug’ in computer science, because her team found an actual bug (a moth) in one of their systems and then taped it into the maintenance log book.

Graham MacDonald

Cool stuff! I’ve always wondered what cutting edge server tech stuff was capable of, enjoyed the vid!

Ljudet Innan

I’ve never been so nervous watching Linus holding new tech.

    mike

    the Intel fab tour was more nerve-wracking lol~ even though he wasn’t holding anything like here, his hand gestures and body movements so near all those precision machines, right after saying we shouldn’t touch anything, were true anxiety. (oh yea, and he actually did pat the machines anyway) XD

    Phoenux

    Something tells me the display units are probably nonfunctional if they’re willing to let Linus take one off the wall and open it up with little to no supervision.

    Uberfish

    @Phoenux Nope. I’m sure they are fully functional hardware items. I’m kinda sad he didn’t drop one!

    Next week: Repairing the $150,000 server we had to buy after breaking it!

    Рыбкин Павел

    @Uberfish haha 😀 I need to see it! But I guess it costs much more.

    Ingwie Phoenix

    If you’ve watched him for years, you get used to it.
    The gold controller and the 10k Intel CPU (which he dropped) are just the first things that come to mind. xD

Makumaku

That Grace Hopper is a freaking piece of art. It makes you want to code an entire OS and Game just to test it. Just imagine what a crazy project that would be.

    Ericmeister

    I love the homage to grace Hopper btw. Excellent naming

    Allan

    How well can Rollercoaster Tycoon 2 run on this thing if natively translated, that’s what I’m wondering

    Chris Housby

    ​@Ericmeister imagine going back and telling her how small a transistor would become. The ones she developed software languages with were vacuum tubes several inches long.

    brodriguez11000

    We know what future Crysis developers will be using.

Artamere Enshort

It’s a real moment of pure pleasure, to see Linus with eyes that shine, like a kid in a toy store

    A Potatoman

    until he drops it

Dark

It’s ironic that after an era in which we went from needing dozens of dedicated cards to having most things handled in software, we are now going in reverse: hardware handling things with dedicated chips or cards.

    Ferretsnarf

    About 10 years ago when I was in college for Electrical and Computer Engineering this is actually one of the things we were talking about. We’re more or less hitting a brick wall in miniaturization and increasing the raw speed of individual components. How do we improve performance when we can’t miniaturize our chips any more than we already have (At this point we’re talking about transistors that are so small that you can count their width in atoms)? Well you offload tasks into different chips (TCP/IP on the network adapter and like Linus showed putting the encryption workload on the adapter). If you find there’s a specific workload that you’re constantly asking your general-purpose CPU to do, it might start to make sense to put that task on a specialist chip rather than putting it on your CPU.
    ASICs are on the rise and expansion cards are coming back.
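
A minimal CUDA sketch of the offload pattern described above (the kernel, buffer size, and names are purely illustrative, not anything from the video): a small, well-defined workload is handed to a specialist processor instead of being ground through on the general-purpose CPU.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// Trivial stand-in for a hot, well-defined workload: scaling a buffer.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;  // one GPU thread per element
}

int main() {
    const int n = 1 << 20;
    float *d = nullptr;
    cudaMalloc(&d, n * sizeof(float));   // buffer lives on the specialist chip
    cudaMemset(d, 0, n * sizeof(float));

    // The CPU only issues the request; the GPU does the grinding,
    // much like a NIC doing checksums or TLS instead of the host CPU.
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    cudaDeviceSynchronize();

    printf("offloaded %d elements to the GPU\n", n);
    cudaFree(d);
    return 0;
}
```

Whether the specialist silicon is a GPU, a NIC, or a purpose-built ASIC, the shape is the same: the CPU dispatches, the dedicated chip does the repetitive work.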

    autohmae

    Do you remember some people saying ‘the end of Moore’s law’? That’s what is going on here…

    Ferretsnarf

    @autohmae Yeah, we were talking about that at the time as well. I avoided saying it because I kind of hate talking about Moore’s law online – you almost always get some kind of blowback when you talk about Moore’s law being dead. On the consumer side of things I could almost see why you might think Moore’s law isn’t dead. We’re not really seeing smaller+faster all that much anymore. We occasionally barely scrape by into a smaller node, but you’re not really getting faster and more efficient transistors out of it anymore; instead you’re mostly cramming more stuff onto the die and subsequently aiming a firehose at it to hope you cool it enough to not explode.

    autohmae

    @Ferretsnarf do you know why consumers with some technical knowledge don’t know it’s dead? Because of the marketing around CPU process node sizes.

    Just Some Dinosaur

    @Ferretsnarf This has happened before, and ASICs have always had a place alongside general-purpose processors. The stagnation in tech is a more complex problem, as opposed to being exclusively down to physics. As it is, quite a few clever people in fields of research have proposed numerous workarounds that are plausible in theory, but simply not testable at the moment and not feasible on a wide scale, especially without aggressive grant funding like in the past.

    If anything, I would say that we’re actually quite lucky that AI has brought about a bit of a resurgence in potential general optimization and advancement.

    Finally, Moore’s law was always more of a “loose observation” and was never intended to be indefinite, with Moore himself saying he was certain the trend would not hold indefinitely and would become irrelevant to the next steps in advanced design.

Sergio J G Silva

The smile on Linus’ face is like an ’80s kid going to a toy store… you know you won’t leave the place with anything, but just being surrounded by the toys is a joy

Joseph Lawhorne

I think the most surprising thing to me is that Gigabyte has enterprise class hardware.

    Konnor J

    Ah but notice they did NOT show you the GB power supply?

    Lmfao

    Xsphyre

    Please don’t say that about Gigabyte

ed gillis

ARM is a RISC instruction set. The Hewlett-Packard PA-RISC was way ahead of its time. I worked on the first HP 3000 MPE & HP 9000 HP-UX systems. Some of the desktop workstations, like the tiny 715 systems, were incredible in the 1980s.

    Konnor J

    Ah, the toys of my youth! I worked with some of that wayyyy back when, along with many other goodies that all of the winbloze babies wouldn’t have any clue what it is now, never mind how to use it. And due to the millennials’ (and beyond) idiotic, overly entitled, arrogant BS, they don’t even appreciate what was gained via our hard work to make the current toys even possible, long before they were a wet spot on cheap hotel sheets

    Daniel Woods

    Beat me to this lol good comment

    Kelle Cetraro

    You beat me to this comment, but I made it anyway 😂

    I’m sort of scratching my head as to why he’s acting like it’s a new thing… Just for novelty, I still have a Sun E450 running and productive 😂

Just A Tool

This shows the power of being able to do almost anything in public as long as you have a cameraman with you.

    autohmae

    AND have a proven audience.

    NadirQG

    And have the most popular tech enthusiasts channel

Yonatan Avhar

Man, the speed at which these things rip through your wallet is insane

TheSickness1234

Nvidia: we can connect multiple GPUs in multiple racks into one room-filling huge GPU
Also Nvidia: SLI… yeah, that don’t work

    nhiko999

    Just in case: SLI works, but it’s mainly dependent on the type of work asked of the GPU, and games don’t benefit much from the multiple nodes. For scientific computation, however…

    Emil Klingberg

    Well, SLI is actually a great technology, but it requires high competency from game developers, and let’s just say that’s not too common. Look at simulation programs or modeling and raytracing software and you realize how awesome SLI setups are when running proper software.

    Francesco Barbieri

    ​@Emil Klingberg On point! If you want to see a well-optimized game for SLI/CF, have a look back at Crysis 2! It may not have been the best in the series, but multi-GPU support in that title was wildly effective!

    TheSickness1234

    @Emil Klingberg yeah, feels like game devs these days need ‘guardrails’ enforced by Sony and a one-click ‘enable’ button to implement features.
    (Thinking about the interviews on the Moore’s Law Is Dead channel)

    For the mentioned use cases you can forget running that on consumer cards as none have the connectors anymore

    Krozar TAL

    The difficulty with SLI is that it has to rasterize frames in real time for a 144+ Hz display. GPU-offloaded work, such as NN machine learning, is a much easier task to parallelize.
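
A hedged sketch of the contrast drawn above, assuming a box with more than one CUDA-capable GPU (the kernel, device loop, and slice size are purely illustrative, not anything from the video): offloaded compute can simply be cut into independent per-device slices, with none of the per-frame pacing and synchronization that real-time SLI rendering requires.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// Arbitrary per-element math standing in for a compute workload.
__global__ void work(float *buf, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) buf[i] = buf[i] * buf[i] + 1.0f;
}

int main() {
    int devices = 0;
    cudaGetDeviceCount(&devices);
    const int per_gpu = 1 << 20;  // each GPU gets an independent slice

    for (int dev = 0; dev < devices; ++dev) {
        cudaSetDevice(dev);                          // target one GPU at a time
        float *buf = nullptr;
        cudaMalloc(&buf, per_gpu * sizeof(float));
        cudaMemset(buf, 0, per_gpu * sizeof(float));
        work<<<(per_gpu + 255) / 256, 256>>>(buf, per_gpu);
        cudaDeviceSynchronize();                     // wait for this slice; slices never sync with each other
        cudaFree(buf);
        printf("GPU %d finished its slice\n", dev);
    }
    return 0;
}
```

A game frame, by contrast, needs every contributing GPU to finish its share on the same strict deadline, which is why SLI leaned so heavily on per-title developer effort.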

Isa Nova

This is 100% the content I live for, awesome video. Keep up the good work!

Simon Bolton

What I’m beginning to notice is that “compute modules” are essentially the PC, and the motherboard isn’t really a motherboard anymore; it’s just an I/O interface for the compute module. Which, if you remember, is how we used to make computers 40 years ago, just with wildly more advanced tech.

    Krozar TAL

    Yep. Everything got shoved into an ISA slot. Keyboard controller, mouse controller, VGA card, memory, etc.

    Bass Player

    We’re reaching the limits of shrinking, and the motherboard is a bottleneck. You buy it in a fab package.

    brodriguez11000

    Still do with industrial computers. e.g. VME, etc.

Taylor Straton

For Linus to not drop whatever he’s holding immediately after saying “I don’t even wanna know what this thing costs” is pretty astounding to me.

Theo Higgins

Ignoring the insane numbers related to this thing, what stuck out to me is that I really like the naming scheme.

Nvidia naming their GPU generations after famous computer scientists has always been cool, but it stuck out as weird to me that for Ada Lovelace they broke tradition and used her full name instead of just “Lovelace”.

But here using the full name now makes perfect sense, by going with “Grace” for the CPU, “Hopper” for the GPU and “Grace Hopper” for the entire thing you get both an intuitive naming scheme and an even cooler way to pay respect to these greats.

Now I’m imagining a world where we had “Johannes” or “André-Marie” CPUs/superchips though, doesn’t roll off the tongue as easily lol

Tim Seguine

The network card you showed at the end reminds me a lot of IBM Z-Series programmable IO. And yeah, offloading IO stuff to a coprocessor is the secret to crazy high throughput. You guys have seen in your reviews of desktop products how bogged down the system gets with high-speed IO if it all needs to be handled by the CPU.
