5 Words From Jensen Huang That Every Nvidia Stock Investor Needs to Hear Before Nov. 20

With a market capitalization of $3.6 trillion, Nvidia (NASDAQ: NVDA) is currently the world’s largest company. It created $3.2 trillion of that value over the last two years alone thanks to surging demand for its data center graphics processing units (GPUs), which are the most popular in the world for developing artificial intelligence (AI) models.

Nvidia will report its latest financial results for its fiscal 2025 third quarter (which ended on Oct. 27) on Nov. 20, and the company is forecast to deliver record revenue led by its data center segment.


In an interview with CNBC last month, Nvidia CEO Jensen Huang made a series of very positive comments about the company’s new Blackwell GPU architecture. He said five words, in particular, that should have every Nvidia stock investor excited ahead of Nov. 20.

A substantial amount of computing power is required to develop AI models, and most businesses can’t afford to build the necessary data centers because chips are so expensive. Nvidia’s H100 was the go-to data center GPU for AI development for most of last year, and a single unit would cost up to $40,000. Some AI applications required tens of thousands of them.

That’s why well-resourced tech giants like Microsoft and Amazon are building centralized data centers and renting the computing capacity to enterprises. It’s a profitable venture for them at scale, and it makes AI financially accessible to businesses that can’t build their own infrastructure.

Nvidia’s new Blackwell architecture delivers an enormous leap in performance. The Blackwell-based GB200 NVL72 GPU system can perform AI inference at 30 times the speed of the equivalent H100 system. Each individual GB200 GPU sells for between $30,000 and $40,000, which is around the same price as the H100 when it first came out.

Roughly 30 times the inference throughput at approximately the same per-GPU price works out to roughly a 30-fold improvement in cost efficiency, meaning some of the most advanced AI models will be financially accessible to a much wider range of businesses and developers.
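For readers who want to see that arithmetic spelled out, here is a minimal back-of-envelope sketch in Python. It assumes only the figures quoted above (a roughly $40,000 H100, a $30,000 to $40,000 GB200, and a ~30x inference speed multiple); the "throughput unit" is a hypothetical normalized placeholder, not a benchmark result.

```python
# Back-of-envelope sketch (illustrative only): hardware cost per unit of
# inference throughput, using the figures cited in the article. The
# baseline throughput of 1.0 is an arbitrary normalization, not a benchmark.

H100_PRICE = 40_000          # approximate launch price per H100, in dollars
GB200_PRICE = 40_000         # top of the quoted $30,000-$40,000 range
H100_THROUGHPUT = 1.0        # normalized inference throughput (placeholder unit)
GB200_THROUGHPUT = 30.0      # article cites ~30x the equivalent H100 system's speed

def cost_per_throughput_unit(price: float, throughput: float) -> float:
    """Dollars of hardware per normalized unit of inference throughput."""
    return price / throughput

h100_cost = cost_per_throughput_unit(H100_PRICE, H100_THROUGHPUT)
gb200_cost = cost_per_throughput_unit(GB200_PRICE, GB200_THROUGHPUT)

print(f"H100:  ${h100_cost:,.0f} per throughput unit")
print(f"GB200: ${gb200_cost:,.0f} per throughput unit")
print(f"Improvement: ~{h100_cost / gb200_cost:.0f}x better cost efficiency")
```

Under these assumptions the GB200 delivers the same unit of inference throughput for roughly one-thirtieth of the hardware cost, which is the basis for the cost-efficiency claim above.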

Microsoft is rumored to be the biggest buyer of Blackwell GPUs so far. It told investors it allocated $20 billion to capital expenditures (capex) during its fiscal 2025 first quarter (ended Sept. 30) alone, most of which went toward AI data centers and chips. That followed $55.7 billion in capex spending in fiscal 2024.

Similarly, Amazon spent $30.5 billion on AI infrastructure in the first half of 2024, and it’s on track to spend almost $45 billion in the second half, which will take its total investment to $75 billion for the year.
