Nvidia GPUs are so hard to get that rich venture capitalists are buying them for the startups they invest in

Jun 14, 2023, 02:42 IST
Business Insider
Nat Friedman, the former CEO of GitHub, and investor Daniel Gross have bought thousands of Nvidia GPUs and are making them available to startups in the midst of a GPU shortage caused by an AI boom. (Image: GitHub)
  • Nvidia GPUs are essential for training the big models behind ChatGPT and other generative AI tools.
  • These specialized chips are expensive and in short supply.

Nvidia GPUs are the water that feeds today's flourishing AI ecosystem. Without the power that the A100 and H100 chips provide, ChatGPT and most other new generative AI services wouldn't exist.

With the boom in AI startups and services, there's so much demand for these crucial components that Nvidia, and manufacturing partner TSMC, can't make enough of them. While the demand has been great for Nvidia, which gained almost $200 billion in market cap in a single day last month after announcing higher revenue guidance for its GPU chips, it's been less fortunate for AI startups. Prices have soared and shortages are common.

This is giving Big Tech companies another huge advantage over smaller upstarts. If you really want to compete in this new AI world, you have to train your own models on your own data, which can require a large number of GPUs. Otherwise, you will just be another app on someone else's platform.

Microsoft and its partner OpenAI know this. Google does too. Even Adobe does. They are rushing to train large foundation models on huge amounts of data and have the benefit of billions of dollars to invest in this expensive process.

Many startups can't afford that, or they simply can't get hold of the chips. Even tech juggernaut Microsoft is facing a hardware crunch, going as far as to ration internal access to GPUs to save processing power for its AI-powered Bing chatbot and its AI tools for Microsoft Office, The Information reported.


So, some venture capitalists are taking unusual steps to help.

Nat Friedman, the former CEO of GitHub, and Daniel Gross, who has backed GitHub, Uber, and a host of other successful startups, have purchased thousands of GPUs and set up their own AI cloud service.

Called Andromeda Cluster, the system has 2,512 H100 GPUs and is capable of training a 65 billion parameter AI model in about 10 days, the VCs said. That's not the largest model out there, but it's a sizeable one.
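For a rough sense of scale, the widely used approximation of training compute as 6 × parameters × training tokens can be checked against the cluster's throughput. The sketch below is a back-of-envelope estimate only: the token count, per-GPU throughput, and utilization figures are illustrative assumptions, not numbers reported by Friedman and Gross.

    # Back-of-envelope training-time check (illustrative assumptions, not from the article)
    PARAMS = 65e9          # 65 billion parameters (figure cited by the investors)
    TOKENS = 1.4e12        # assumed training tokens, roughly LLaMA-65B scale
    GPUS = 2512            # H100 count reported for the Andromeda Cluster
    PEAK_FLOPS = 1e15      # assumed ~1 PFLOP/s BF16 peak per H100
    UTILIZATION = 0.35     # assumed sustained model FLOPs utilization

    total_flops = 6 * PARAMS * TOKENS                 # standard 6*N*D compute estimate
    cluster_flops = GPUS * PEAK_FLOPS * UTILIZATION   # effective cluster throughput
    days = total_flops / cluster_flops / 86400
    print(f"roughly {days:.0f} days of training")     # comes out to about a week

Under those assumptions, the estimate lands in the same ballpark as the roughly ten days the investors cite.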

The catch: it is only available to startups backed by Friedman and Gross. Still, the move is already drawing praise.

"Individual investors doing more to support compute-intensive startups than most governments. (Very cool project!)," tweeted Jack Clark, a cofounder of AI startup Anthropic.
