
The hottest commodity in AI right now isn't ChatGPT — it's the $40,000 chip that has sparked a frenzied spending spree

Hasan Chowdhury   

  • ChatGPT isn't the hottest commodity in AI right now — it's a chip named Hopper.
  • Nvidia's Hopper, also known as H100, is the workhorse that drives the AI models behind ChatGPT.

Move aside, ChatGPT. There's a new hot commodity in AI: a $40,000 workhorse chip that goes by the name of Hopper.

Named after pioneering computer scientist Grace Hopper, the ultra-expensive chip, officially called the H100 Tensor Core GPU, is the hot property of Silicon Valley giant Nvidia, and it's sought after by just about everyone in the tech sector.

Why? It's simple, really: H100 is a processor that is almost single-handedly driving the generative AI revolution, giving large language models like those underlying ChatGPT the computing power needed to process the billions of parameters that shape their output.

Without these chips, the advancement of AI models risks grinding to a halt. That also explains the mad dash to get hold of H100 processors as the threat of a shortage jolts buyers into action.

The latest development in the AI gold rush comes from the Gulf region as Saudi Arabia and the UAE scramble to get their hands on Nvidia's processors. The two states are buying up thousands of H100 units, per the Financial Times, as they ramp up their AI ambitions.

In startup land, where so much of AI's pioneering work is taking place, the short supply of these GPUs has wealthy VCs so worried their startups will fall behind that they've begun buying units on the startups' behalf whenever they can.

On the flip side, The Verge reported that CoreWeave, a startup backed by Nvidia, has pledged H100 GPUs it owns as collateral to secure a $2.3 billion loan, a war chest it plans to use to buy more H100 units.

Such is the demand that, as Reuters reported in June, US sanctions preventing the export of Nvidia's top chips to China and Hong Kong had helped create an underground market.

This is the culmination of a crazed few months of spending by "everyone and their dog," as Elon Musk once put it, as they seek the tech that could send their AI progress into warp speed.

Insider reported in April this year that Musk himself had bought roughly 10,000 GPUs as part of his AI-focused attempt to transform X into a super app. Musk has also since announced his own AI company, xAI, as a counter to OpenAI.

These extraordinary maneuvers to purchase chips for generative AI make it hard to overstate just how valuable they are.

Sure, OpenAI's chatbot has been the face of the new AI age since its launch, but Nvidia's processors, a key driver of the company's ascent to a $1 trillion market capitalization in May, offer a staying power that ChatGPT does not yet enjoy, especially amid reports that the chatbot's answers have been getting worse.

It will only get crazier from here: Last week, Nvidia unveiled a next-gen version of the H100 called the GH200, which is slated for launch in the second quarter of next year.



