1 Stock-Split AI Stock to Buy Before It Soars 450%, According to a Wall Street Expert

Philip Panaro is a founder and former CEO of Boston Consulting Group (BCG) Platinion, a division of BCG that offers technology consulting services. During an interview in November, Panaro told Schwab Network that Nvidia (NASDAQ: NVDA) could hit $800 per share by 2030 due to its leadership in artificial intelligence (AI) accelerators. That forecast implies about 450% upside from its current share price of $145.
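For reference, the implied return follows directly from those two figures: ($800 − $145) / $145 ≈ 4.5, or roughly 450% above the current price.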

Of course, Nvidia has been one of the hottest stocks on the market. Its share price has surged over 900% since the late-2022 launch of ChatGPT led to an exponential increase in demand for AI infrastructure. The company conducted a 10-for-1 stock split earlier this year to bring its share price back down to a more accessible level after that run-up, and another split may be in the cards if Panaro is correct.
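As a quick refresher on the mechanics (the dollar figures here are illustrative, not Nvidia's actual pre-split price): a 10-for-1 split turns 1 share trading at $1,000 into 10 shares trading at $100 each, so the value of an investor's position is unchanged, since 1 × $1,000 = 10 × $100.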

Here’s what investors should know.

Nvidia holds 98% market share in data center graphics processing units (GPUs), the chips used to accelerate complex data center workloads such as training machine learning models and running artificial intelligence applications. One reason for that dominance is superior chip performance: Nvidia regularly posts the highest scores in the MLPerf benchmarks, objective tests that measure the performance of AI systems.

But there is another reason Nvidia accounts for virtually all data center GPU sales: It spent the better part of the last two decades building an expansive software ecosystem. In 2006, Nvidia introduced its CUDA programming model, a platform that now spans hundreds of code libraries and pretrained models that streamline AI application development across use cases ranging from autonomous cars and robots to conversational agents and drug discovery.
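For readers curious what that programming model looks like in practice, here is a minimal, illustrative CUDA sketch (a generic vector-addition kernel of the kind found in introductory tutorials, not code from Nvidia's libraries or from the article):

#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements.
__global__ void vectorAdd(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                      // about one million elements
    const size_t bytes = n * sizeof(float);

    // Unified memory is accessible from both the CPU and the GPU.
    float *a, *b, *out;
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(a, b, out, n);
    cudaDeviceSynchronize();

    printf("out[0] = %f\n", out[0]);            // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}

In practice, the code libraries mentioned above exist so that developers rarely need to write kernels like this by hand.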

Additionally, Nvidia has branched into other hardware verticals, like central processing units (CPUs) and networking gear. Indeed, Nvidia has a leadership position in InfiniBand networking, currently the most popular connectivity technology for back-end AI networks. The ability to integrate hardware components into a cohesive computing system lets Nvidia build data centers with the lowest total cost of ownership, according to CEO Jensen Huang.

Here is the big picture: Competing with Nvidia is exceedingly difficult. Its GPUs are not only the fastest AI accelerators on the market but are also backed by the most robust software development platform, and vertical integration gives the company another key advantage. As a result, even though Nvidia has more pricing power than its peers, its systems are less expensive once direct and indirect costs are taken into account.
