Larry Ellison is the chairman of Oracle (NYSE: ORCL), which is currently building some of the fastest and most cost-efficient data centers in the world for developing artificial intelligence (AI). Elon Musk, on the other hand, runs Tesla (NASDAQ: TSLA), which is building AI-powered self-driving software for its electric vehicles. He also runs SpaceX, X (formerly Twitter), and a new AI start-up called xAI.
Ellison and Musk need tens of thousands of graphics processing units (GPUs) for their data centers to bring AI to life, and Nvidia (NASDAQ: NVDA) supplies the best chips in the industry.
At Oracle’s financial analyst meeting on Sept. 12, Ellison told the audience that he and Musk recently went to dinner with Nvidia CEO Jensen Huang at the Nobu restaurant in Palo Alto. The two, who are among the richest people on Earth, found themselves begging Huang for something money simply can’t buy at the moment. Here’s how it went down.
The arms race for GPUs
Oracle currently has 162 data centers either live or under construction, but it believes that number could eventually top 2,000 because the demand for computing power from AI developers is soaring. Some of Oracle’s largest data centers feature clusters of more than 32,000 GPUs, but next year the company will offer a cluster of 131,072 GPUs from Nvidia’s latest Blackwell lineup.
Oracle designed unique RDMA (remote direct memory access) networking technology that can move data from one point to another more quickly than traditional Ethernet networks, and since developers pay for computing power by the minute, this can significantly reduce costs. That’s why leading AI start-ups like OpenAI, Cohere, and even Musk’s xAI are using Oracle’s infrastructure.
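To make the billing point concrete, here is a minimal back-of-the-envelope sketch in Python. The prices, runtimes, and cluster size are entirely hypothetical (not Oracle's actual rates); the point is simply that when a job is billed by the GPU-minute, trimming the time spent moving data flows straight through to the bill.

```python
# Illustrative sketch only: hypothetical numbers, not Oracle pricing.
# Shows why faster data movement lowers cost when compute is billed by the minute.

def job_cost(total_minutes: float, price_per_gpu_minute: float, gpus: int) -> float:
    """Cost of a training job billed per GPU-minute."""
    return total_minutes * price_per_gpu_minute * gpus

# Assume a job whose runtime is partly network-bound data transfer.
compute_minutes = 600            # time spent on actual GPU math (hypothetical)
ethernet_transfer_minutes = 200  # data movement over a traditional network (hypothetical)
rdma_transfer_minutes = 50       # the same data movement over a faster RDMA fabric (hypothetical)

price = 0.05   # dollars per GPU-minute (hypothetical)
gpus = 1_000   # cluster size (hypothetical)

ethernet_cost = job_cost(compute_minutes + ethernet_transfer_minutes, price, gpus)
rdma_cost = job_cost(compute_minutes + rdma_transfer_minutes, price, gpus)

print(f"Traditional network: ${ethernet_cost:,.0f}")
print(f"RDMA-style network:  ${rdma_cost:,.0f}")
print(f"Savings:             ${ethernet_cost - rdma_cost:,.0f}")
```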
In its recent fiscal 2025 first quarter (ended July 31), the Oracle Cloud Infrastructure (OCI) segment generated $2.2 billion in revenue, a whopping 45% jump from the year-ago period. However, it could be growing even faster if not for supply constraints — in other words, Oracle simply can’t get its hands on enough GPUs for its data centers.
Not only is Oracle battling other cloud giants like Microsoft, Amazon, and Alphabet for GPU allocations from Nvidia, but tech companies like Tesla and Meta Platforms are also soaking up supply to develop AI for their own purposes. Tesla is trying to bring a cluster of 50,000 GPUs online this year to enhance its self-driving software, which requires a substantial amount of computing power.
Meta, on the other hand, used around 16,000 of Nvidia’s flagship H100 GPUs to train its Llama 3.1 large language model (LLM), but the company plans to increase its capacity to a mind-boggling 600,000 H100 equivalents by the end of this year. That will pave the way for Llama 4, which CEO Mark Zuckerberg says could set the benchmark for the industry in 2025.
Ellison and Musk are begging for more GPUs
Please take our money … take more of it. You’re not taking enough. … We need you to take more of our money. Please.
— Ellison’s and Musk’s comments to Jensen Huang over dinner, according to Ellison.
Ellison and Musk were practically begging Huang for more GPUs, but no amount of money in the world can buy the numbers they require right now because Nvidia simply can’t keep up with demand. Oracle and Tesla aren’t even Nvidia’s biggest customers!
Oracle spent $6.9 billion on capital expenditures (capex) during fiscal 2024 (ended April 30), and it expects to spend double that, roughly $13.8 billion, in fiscal 2025. Most of the money will go toward buying chips and building data centers. Tesla plans to spend over $10 billion on total capex this calendar year, some of which will go toward the 50,000-GPU cluster mentioned earlier.
Those numbers are modest compared to what other tech giants are spending. Microsoft allocated $55.7 billion to capex during its fiscal 2024 (ended June 30), and it plans to spend even more in fiscal 2025. Amazon’s capex, meanwhile, could top $60 billion in calendar 2024.
Therefore, it’s no surprise that Nvidia generated $26.3 billion in data center revenue during its recent fiscal 2025 second quarter (ended July 28), a 154% increase from the year-ago period. Ellison says the wave of AI spending could continue for the next 10 years as companies and nation-states battle for AI supremacy, so Nvidia’s data center revenue probably has plenty of growth left in the tank.
John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board of directors. Anthony Di Pizio has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Alphabet, Amazon, Meta Platforms, Microsoft, Nvidia, Oracle, and Tesla. The Motley Fool recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.
You Won’t Believe What Larry Ellison and Elon Musk Said to Nvidia CEO Jensen Huang was originally published by The Motley Fool