The most sought-after resource in the tech industry right now isn’t a specific type of engineer. It’s not even money. It’s an AI chip made by Nvidia called the H100.
These GPUs are “considerably harder to get than drugs,” Elon Musk has said. “Who’s getting how many H100s and when is top gossip of the valley rn,” OpenAI’s Andrej Karpathy posted last week.
I’ve spent this week talking with sources throughout the AI industry, from the big AI labs to cloud providers and small startups, and come away with this: everyone is operating under the assumption that H100s will be nearly impossible to get through at least the first half of next year. The lead time for new orders, if you can get them at all, is roughly six months, an eternity in the AI space. Meanwhile, the cloud providers are only just starting to make their H100s widely available, and they’re charging an arm and a leg for the limited capacity they have. For the most part, these hosting providers also require extremely costly, lengthy upfront commitments.