NVIDIA discussion thread

Already at $2.4 trillion.
 
Nvidia will introduce its new AI chip at a conference dubbed "Woodstock" for AI developers

An AI party

Dubbed the Woodstock festival of AI by Bank of America analysts, GTC this year is set to draw 300,000 in-person and virtual attendees for the debut of Nvidia Corp.’s B100. This is the latest generation of its artificial intelligence accelerators that have set the world and the company’s stock alight.

Chief Executive Officer Jensen Huang is scheduled for a highly anticipated keynote Monday afternoon, and rumor is that the B100 will be the company’s first multi-die product, a breakthrough in technology whereby a larger design is partitioned into smaller chiplets for superior performance. And crucially, it’s supposed to have almost double the power of Nvidia’s current-generation H100, the powerhouse used to train many of the world’s most capable large language models.
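Taking the "almost double the power" claim at face value, here is a rough sketch of what that buys in training time, using the common ~6 × parameters × tokens FLOPs approximation. The model size, token count, cluster size, per-GPU throughput and utilization below are illustrative assumptions, not figures from the article.

```python
# Rough sketch: what "almost double the power" of an H100 could mean for
# LLM training time. Uses the common ~6 * N * D FLOPs-per-token estimate.
# Model size, token count, cluster size, per-GPU throughput and utilization
# are illustrative assumptions, not published figures.

def training_days(params, tokens, gpus, pflops_per_gpu, utilization=0.4):
    """Approximate wall-clock training time in days."""
    total_flops = 6 * params * tokens                        # ~6 FLOPs per parameter per token
    sustained = gpus * pflops_per_gpu * 1e15 * utilization   # delivered FLOP/s
    return total_flops / sustained / 86_400

N, D, GPUS = 70e9, 2e12, 1_024   # assumed: 70B-parameter model, 2T tokens, 1,024 GPUs

h100 = training_days(N, D, GPUS, pflops_per_gpu=1.0)  # assumed ~1 PFLOP/s-class per GPU
b100 = training_days(N, D, GPUS, pflops_per_gpu=2.0)  # "almost double", per the article

print(f"H100-class cluster: ~{h100:.0f} days")
print(f"B100-class cluster: ~{b100:.0f} days")
```

Under those assumptions the run drops from roughly 24 days to roughly 12, which is the practical upshot of a 2x chip: the same model for half the time (or cost), and why the product pipeline gets such close investor scrutiny.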

What Huang says about Nvidia’s product pipeline is under close scrutiny and will inform how investors see the firm’s competitive moat. There is also ample hope that Nvidia will explain how it will make more money from software, going up against some of the big-name tech customers also in the audience.

GTC is a front-row seat to what’s happening in AI, hosted by a company going through a supersonic stock rally and whose graphics cards have powered so much of AI development globally. This year’s edition comes after Nvidia added $1 trillion to its value on the strength of the aforementioned H100, and the BofA analysts who dubbed this event Woodstock did so while raising their price target on the stock to $1,100 from $925.

Traditionally, developer conferences target the engineers and computer scientists building software and applications. The resounding theme of each one I’ve attended this year has been AI, and Nvidia won’t disappoint on that front. What’s remarkable is that the Santa Clara, California-based company will unveil a new hardware component — something buried inside a server — that’s attracting a level of excitement usually reserved for consumer products from big tech. That’s what makes Apple Inc.’s WWDC extra compelling — last year it was the debut of the Vision Pro headset — and Alphabet Inc.’s Google I/O intriguing.

Nvidia’s H100, which I got my hands on last year, has become a badge of honor for the startups lucky and rich enough to buy a bunch of units to help build their foundation models. What’s curious is that those same startups will likely never lay eyes on them. An H100 cluster is not a bag of chips you can sling over your shoulder or plug into your PC. It ships in 300-pound server designs that go into data centers. Yet somehow the H100’s built up a cult following, with page upon page of literature, forum threads and social media discussion on its capabilities. It’s as close to a rock star as a silicon chip can get.

When Huang gave his first GTC keynote in 2009, there were fewer than 1,000 seated before him in the ballroom. Nvidia’s shares have jumped more than 24,000% since then, and this time there’ll be a live audience of 10,000. At the last (virtual) GTC, 22 million people tuned in to hear Jensen’s address. Expect a few more this time around.

 
It’s crazy how far ahead of other companies Nvidia is in AI hardware. Everything in AI depends on their hardware advancement.
 
I'm puzzled why NVIDIA is getting involved with Chinese companies 🤮 Personally, I'd steer clear of them altogether.

Hopefully, Congress passes a law to prevent any NVIDIA collaboration with those entities. They're notorious for just taking things and NVIDIA can do without them.

Those Chinese companies rely on NVIDIA because without chip access they'd suddenly find themselves in third-world territory.

Something will need to be done about this 🤮
 
Nvidia is counting on companies to buy large quantities of these GPUs, of course, and is packaging them in larger designs, like the GB200 NVL72, which plugs 36 CPUs and 72 GPUs into a single liquid-cooled rack for a total of 720 petaflops of AI training performance or 1,440 petaflops (aka 1.4 exaflops) of inference. It has nearly two miles of cables inside, with 5,000 individual cables.

Each tray in the rack contains either two GB200 chips or two NVLink switches, with 18 of the former and nine of the latter per rack. In total, Nvidia says one of these racks can support a 27-trillion-parameter model. GPT-4 is rumored to be around a 1.7-trillion-parameter model.
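As a sanity check on those rack-level numbers, the quick sketch below works out the implied per-GPU throughput and one way the 27-trillion-parameter claim could pencil out. The per-GPU HBM capacity and the 4-bit weight precision are my assumptions, not figures from the post.

```python
# Quick sanity check on the NVL72 figures quoted above. The rack-level
# numbers (72 GPUs, 720/1,440 petaflops, 27T-parameter claim) come from
# the post; the per-GPU HBM capacity and the low-precision weight format
# are illustrative assumptions.

GPUS_PER_RACK = 72
TRAIN_PFLOPS  = 720     # rack-level AI training throughput (as quoted)
INFER_PFLOPS  = 1_440   # rack-level inference throughput (as quoted)

print(f"Per-GPU training:  {TRAIN_PFLOPS / GPUS_PER_RACK:.0f} PFLOPS")
print(f"Per-GPU inference: {INFER_PFLOPS / GPUS_PER_RACK:.0f} PFLOPS")

# One reading of the 27-trillion-parameter claim: the weights alone, stored
# at 4-bit precision (0.5 bytes per parameter), just about fit in the rack's
# aggregate HBM if each GPU carries ~192 GB (assumed capacity).
PARAMS      = 27e12
BYTES_PER_P = 0.5        # assumed 4-bit weights
HBM_PER_GPU = 192e9      # bytes per GPU, assumed
need_tb = PARAMS * BYTES_PER_P / 1e12
have_tb = GPUS_PER_RACK * HBM_PER_GPU / 1e12
print(f"Weights at 4-bit: ~{need_tb:.1f} TB needed vs ~{have_tb:.1f} TB of HBM in the rack")
```

Under those assumptions the weights need about 13.5 TB against roughly 13.8 TB of HBM across the rack, which is consistent with the 27-trillion figure being a memory-capacity ceiling rather than a statement about training speed.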

 

Because the founder is Chinese?

And the trend Nvidia is currently following is the result of a Chinese-American partnership?
 
You're talking nonsense. Jensen was born in Taiwan, which has nothing to do with China as it's a completely separate country. Besides, Jensen has been living in America practically since childhood, so there's no reason for him to collaborate with the CCP.

It's mainly about the potential of the Chinese market (not because the founder is Chinese), lol, but NVIDIA doesn't need China at all because the world is big enough. Without NVIDIA chips, China would be in a pretty bad spot overnight, turning into a third-world country.
 
