Nvidia Stock: Buy, Sell, or Hold?

One of the hottest stocks of 2024 is Nvidia (NASDAQ: NVDA), up more than 185% this year as of this writing. That performance has helped make it one of the largest companies in the world as measured by market cap.

But investing tends to focus on the future, so while the semiconductor company and its shareholders are enjoying this success, the question on many investors’ minds right now is whether the stock is a buy, sell, or hold from here. Let’s take a look at each case to help decide.


The sell case for Nvidia largely revolves around future demand for its graphics processing units (GPUs). While the company reports robust demand for its artificial intelligence (AI) chips, the big question is how long this outsized demand will last.

The big cloud computing companies, AI start-ups, and other tech companies are racing to build the best AI models, which require enormous computing power and large numbers of GPUs to train. If these large language models (LLMs) become good enough sometime soon, spending on AI training, which has been the biggest driver of Nvidia’s GPU sales, could begin to decrease.

At the same time, Advanced Micro Devices has carved out a niche in the inference side of the AI infrastructure market, while several companies have turned to custom AI chips designed with the help of companies like Broadcom. Nvidia holds the dominant position in AI training, but it could face stiffer competition if demand shifts toward inference.

If these scenarios play out, Nvidia would likely see declining sales and earnings, and the stock would be a sell.


While the sell case for Nvidia rests on future demand, so does the buy case. Demand for Nvidia’s chips is insatiable, and there is no sign of it easing. As AI models advance and become more sophisticated, they don’t just need more computing power and GPUs to train; they need exponentially more.

For example, xAI’s Grok 3 LLM needed five times as many GPUs to train as its predecessor, Grok 2, while Meta Platforms said its Llama 4 LLM would need up to 10 times the computing power of Llama 3. Oracle, meanwhile, said earlier that it sees no letup in AI infrastructure spending over the next five to 10 years, and Nvidia’s customers have by and large indicated that their AI-related capital expenditure budgets are going up in 2025.
