Prediction: Nvidia Stock Is Going to Soar Over the Next 12 Months

Nvidia (NASDAQ: NVDA) is the world's leading supplier of data center graphics processing units (GPUs), the chips used to develop artificial intelligence (AI). Over the last two years alone, GPU sales have helped Nvidia add $3.2 trillion to its valuation.

The company reported results for its fiscal 2025 third quarter (ended Oct. 27) after the market closed on Nov. 20, and they obliterated Wall Street's expectations. Nvidia has just started shipping a new generation of GPUs based on its powerful Blackwell architecture, and demand is far outstripping supply.

Nevertheless, the stock sank 2.5% in after-hours trading following the third-quarter report. I predict shares are going to soar over the next 12 months, so here’s why any weakness might be a buying opportunity.

Image source: Nvidia.

In the past, data centers were built around central processing units (CPUs), which excel at handling a small number of tasks sequentially with high efficiency. GPUs, by contrast, are designed for parallel processing, meaning they can handle numerous tasks at the same time with very high throughput.

That’s crucial when it comes to training AI models and performing AI inference, because those workloads require chips that can rapidly absorb and process trillions of data points.

GPUs built on Nvidia's Hopper architecture — like the H100 and H200 — have been the go-to choice for AI development so far. Data center operators like Microsoft and Amazon buy tens of thousands of those GPUs and rent their computing power to businesses and AI developers, many of which can't afford to build their own infrastructure (a single H100 can sell for up to $40,000).

Now, a new age of AI computing has arrived with Nvidia’s Blackwell GPU architecture. The Blackwell-based GB200 NVL72 system can perform AI inference 30 times faster than the equivalent H100 system.

A recent estimate suggests an individual GB200 GPU within an NVL72 system costs around $83,333, so developers are getting that 30-fold increase in AI inference performance for a mere twofold increase in price compared to the H100.

In other words, the Blackwell GPUs should drive an incredible increase in cost efficiency, so more businesses and developers can afford to deploy the most advanced AI large language models (LLMs).
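To put rough numbers on that cost-efficiency claim, here is a minimal sketch of the performance-per-dollar arithmetic, using only the figures cited above (roughly $40,000 for an H100, an estimated $83,333 per GB200 GPU within an NVL72 system, and a 30-fold inference speedup); the prices are third-party estimates, not confirmed Nvidia list prices.

```python
# Rough performance-per-dollar comparison using the estimated figures cited in the article.
# These prices are estimates, not confirmed Nvidia list prices.

h100_price = 40_000        # approximate top-end price of a single H100
gb200_price = 83_333       # estimated per-GPU cost within a GB200 NVL72 system
inference_speedup = 30     # GB200 NVL72 inference performance vs. an equivalent H100 system

price_ratio = gb200_price / h100_price                  # ~2.1x more expensive per GPU
perf_per_dollar_gain = inference_speedup / price_ratio  # ~14x more inference per dollar

print(f"Price ratio (GB200 / H100): {price_ratio:.1f}x")
print(f"Inference performance per dollar: {perf_per_dollar_gain:.1f}x")
```

Under those assumptions, a roughly 30-fold speedup at roughly 2.1 times the price works out to about 14 times more inference performance per dollar spent.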

Nvidia shipped 13,000 Blackwell GPU samples to customers during the third quarter. Microsoft, Dell, and CoreWeave have already started building Blackwell-based data centers, and Oracle customers will soon be able to access computing clusters with a staggering 131,000 Blackwell GPUs.
