Microsoft's massive investment in OpenAI has put the company at the center of the artificial intelligence boom. But it's not the only place where the software giant is opening its wallet to meet the surging demand for AI-powered services.
CNBC has learned from people with knowledge of the matter that Microsoft has agreed to spend potentially billions of dollars over multiple years on cloud-computing infrastructure from startup CoreWeave, which announced on Wednesday that it raised $200 million. That financing comes just over a month after the company attained a valuation of $2 billion.
CoreWeave sells simplified access to Nvidia’s graphics processing units, or GPUs, which are considered the best available on the market for running AI models. Microsoft signed the CoreWeave deal earlier this year in order to ensure that OpenAI, which operates the viral ChatGPT chatbot, will have adequate computing power going forward, said one of the people, who asked not to be named due to confidentiality. OpenAI relies on Microsoft’s Azure cloud infrastructure for its hefty compute needs.
Microsoft and CoreWeave both declined to comment.
The generative AI rush began late last year after OpenAI introduced ChatGPT to the public, demonstrating that AI can take human input and produce sophisticated responses. Many companies, including Google, have since rushed to add generative AI into their products. And Microsoft has been busy releasing chatbots for its own services, such as Bing and Windows.
With so much demand for its infrastructure, Microsoft needs additional ways to tap Nvidia’s GPUs. CoreWeave CEO Michael Intrator declined to comment about the Microsoft deal in an interview last month, but he said revenue has “gone up by many multiples from 2022 to 2023.”
CoreWeave’s announced funding on Wednesday from hedge fund Magnetar Capital was an extension of a $221 million round in April. Nvidia invested $100 million in the prior financing, Intrator said. CoreWeave was founded in 2017 and has 160 employees.
Nvidia’s stock price is up 170% this year. The company’s market cap briefly topped $1 trillion for the first time this week after it issued a forecast for the July quarter that was over 50% higher than Wall Street estimates.
The chipmaker’s growth will “largely be driven by data center, reflecting a steep increase in demand related to generative AI and large language models,” Colette Kress, Nvidia’s finance chief, said on last week’s earnings call. OpenAI’s GPT-4 large language model, trained with Nvidia GPUs on extensive online data, is at the core of ChatGPT.
Kress referred to CoreWeave by name on the call, and in March, Nvidia CEO Jensen Huang mentioned CoreWeave in his presentation at Nvidia’s GTC conference.
CoreWeave’s website claims the company can deliver computing power that’s “80% less expensive than legacy cloud providers.” Among other cards, CoreWeave offers Nvidia’s A100 GPUs, which developers can also find through the Amazon, Google and Microsoft clouds.
In addition, CoreWeave offers less expensive Nvidia A40 GPUs, which are marketed for visual computing, while the A100 targets AI, data analytics and high-performance computing. Some CoreWeave clients have struggled to obtain enough GPU power on the big clouds, Intrator said. At times, prospective customers have asked for A100 or newer H100 GPUs from Nvidia, and the company has instead recommended A40 GPUs.
These “will do an excellent job at a very cost-effective price,” Intrator said.