Nvidia Just Spent $20 Billion on Groq and Nobody Can Figure Out Why

So Nvidia dropped $20 billion on Groq right before Christmas. The same Groq that was supposed to be Nvidia's competition. The same Groq that built chips specifically designed to NOT be GPUs.
Make it make sense.
The Deal Structure Is Weird
Here's the thing: this isn't a normal acquisition. CNBC reports it's technically an “asset acquisition” with a “non-exclusive licensing agreement.” What does that mean in practice? Groq's cloud platform (GroqCloud) stays independent. CEO Jonathan Ross and President Sunny Madra move to Nvidia. CFO Simon Edwards becomes the new CEO of the cloud business.
It's like buying a restaurant but letting the previous owner keep running the kitchen under a different name.
TechCrunch notes Nvidia paid almost triple what Groq was worth three months ago. The company raised at a $6.9 billion valuation in September; now it's fetching $20 billion. Somebody at Nvidia really wanted this done.
Why This Actually Matters
Groq makes Language Processing Units (LPUs), not GPUs. They're optimized for inference, meaning they're fast at running AI models, not training them. Think of it as the difference between teaching someone to cook and actually making dinner. Nvidia's bread and butter has been training.
Bernstein analyst Stacy Rasgon called the deal structure “maintaining the fiction of competition alive.” The FTC already sued to block Nvidia's attempted Arm deal, and this asset-plus-licensing hybrid probably helps avoid similar scrutiny.
The timing fits Nvidia's September acquisition of Enfabrica for around $900 million. They're buying up AI infrastructure plays across the board.
What This Means for the AI Chip War
Nvidia already dominates AI training with something like 80% of the market. Adding Groq's inference tech gives it an answer for the whole pipeline, training and inference both. Companies like AMD and Intel just got more to worry about.
OpenAI's massive funding round and Nvidia's shopping spree tell the same story: AI infrastructure is where the money is going, and it's going fast.
Ross and Madra joining Nvidia means the brains behind the LPU architecture are now working for the GPU king. Whether that's good for innovation overall… well, that's a different question.
Twenty billion dollars. Three days before Christmas. Welcome to the AI gold rush.
