Tech Tonic | While Nvidia keeps the world regaled, a near-monopoly brews
Amazon Web Services and Google Cloud rely heavily on Nvidia GPUs for AI infrastructure, while Microsoft too is a large buyer of its high-end AI chips.
Barely a few days have passed since Nvidia showed off Cosmos, which it says is “the world’s first world foundation model.” What exactly is a world foundation model, you might ask. Quite simply, the term describes AI models that try to mimic how humans build mental models of the world around them, and similarly predict and generate “physics-aware” videos. Robotics and autonomous motoring are seen as the core areas of relevance.

“The ChatGPT moment for robotics is coming. Like large language models, world foundation models are fundamental to advancing robot and AV development, yet not all developers have the expertise and resources to train their own. We created Cosmos to democratize physical AI and put general robotics in reach of every developer,” Jensen Huang, founder and CEO of NVIDIA, said during the keynote. He’s made a pivotal point. Till this announcement, there weren’t many AI companies actively talking about world foundation models as the next evolution of AI for spatial context.
Nvidia has done this well. Developers already have access to the Cosmos Nano, Super and Ultra models, ranging in size from 4 billion to 14 billion parameters. Robotics and automotive companies including 1X, Agile Robots, Agility, Figure AI, Foretellix, Fourier, Neura Robotics, Skild AI, Virtual Incision, Uber and XPENG are already on board to find relevance with Cosmos. The next few months will be interesting.
Think about this. There aren’t many tech or AI companies as pivotal to pretty much every AI evolution as Nvidia has proved to be over the past couple of years. Core to that are Nvidia’s chips, often the starting point for building any computing application, service or platform. That is where Nvidia has timed its domination, built on pristine chips that barely ever put a foot wrong. Cloud service providers, data centre operators, enterprises, large tech companies and indeed AI startups have been using Nvidia’s hardware to run their services, scale, and train AI models for different, often customised applications.
Take this as an example: Amazon Web Services (AWS), Microsoft Azure, and Google Cloud rely heavily on Nvidia GPUs for their AI infrastructure and services, and are among the largest buyers of Nvidia's high-end AI chips like the A100 and H100. There have been times in the past year and a half when Nvidia couldn’t manufacture these chips quickly enough. The H100 in particular, when in short supply, often went for a significant premium. That was before the $30,000 (around ₹25,00,600) successor, the GB200 Grace Blackwell superchip, which Huang then referenced as an “engine to power this new industrial revolution.”
Early last year, I talked about the AI chip wars with Intel, AMD, and Qualcomm more on the consumer side of things, while Microsoft, Amazon, Meta and Google showed intent to build their own hardware. The reality, as it has unfolded in the months since, is that their efforts don’t have the urgency or scale to really compete with Nvidia.
There is precious little to be heard about Microsoft’s Maia and Cobalt chips since they were opened for public preview in the summer of last year. In fact, analysts estimate that Microsoft bought as many as 485,000 of Nvidia’s ‘Hopper’ chips through 2024.
But there is hope for competition in the space. Broadcom, which makes the Tensor Processing Units (TPUs) for Google, may just be in the right place at the right time. Omdia Research estimates that Google's TPUs could account for between $6 billion and $9 billion in AI semiconductor revenue for Broadcom in 2025.
Nvidia too would be hoping for some of that competition, which promises to turn up but hasn’t thus far. Late last year, the U.S. Department of Justice opened an antitrust probe into Nvidia, considering the chipmaker has more than 90% share of the data centre GPU market. Not that Nvidia has done anything wrong until there is evidence that it may have (though there are allegations it made it difficult for customers to switch to other chips, as well as of differential pricing for customers who also use other vendors’ chips), but such dominance, at a time when AI’s relevance is what it is, is bound to get noticed.
AMD, Intel, Qualcomm, Samsung and Amazon also make AI chips for data centres.
Then in December, Chinese regulators launched an investigation into potential monopolistic practices by Nvidia. We cannot completely discount political cues here either, perhaps a response to the U.S. deploying more curbs on exports to 140 Chinese companies, including chip equipment makers. Nvidia had, in 2022, developed China-spec versions of the A100 and H100 AI chips, something that was workable within the contours of the trade war at the time. Those chips too were restricted by new guidelines in 2023, which led to further modifications.
Nvidia reportedly had more than 90% share of the AI chip market in China too, before the curbs. Huawei has tried to step into that void.
Nvidia eyes personal computing
Spoiler alert. Alongside Cosmos, Nvidia also gave a first glimpse of Project DIGITS. This is a $3,000 personal supercomputer with 1,000x the power of an average laptop. We’ll have to wait till the summer for this “AI personal supercomputer” concept to become a reality. It’ll be powered by a Grace Blackwell Superchip, with 128GB of unified memory, 4TB of SSD storage, and the ability to run models of up to 200 billion parameters.
Nvidia’s personal computing foray beyond graphics chips was anticipated, but the sheer weight of these plans will surely worry Qualcomm, with its new-found Snapdragon X mojo; AMD, attempting to rediscover the form of old; and most certainly Intel, at a time when precious little is going that chipmaker’s way.
Just Nvidia’s way of dropping the mic on the way out? They did make a point to mention DIGITS will “deliver powerful performance using only a standard electrical outlet.” They’re still looking at it more for enterprise-specific scenarios considering the power on tap, but you know for sure that enthusiasts will be lining up with the cash. Why do I reference this when talking about potential monopolies? Nothing from Qualcomm, Intel, AMD or anyone else has come close to this sort of a leap in possibility. And nothing indicates to me that any of Nvidia’s competitors are even ready.
Vishal Mathur is the technology editor for HT. Tech Tonic is a weekly column that looks at the impact of personal technology on the way we live, and vice-versa. The views expressed are personal.