Nvidia’s $20B Groq Deal Just Made Every AI Chip Startup a Whole Lot More Interesting

Nvidia doesn’t usually make big moves on Christmas Eve, but this year it dropped a bombshell: a $20 billion deal to license Groq’s technology and bring on most of Groq’s team, including CEO Jonathan Ross.

And just like that, the AI chip market got a lot more crowded—and a lot more valuable.

For years, Nvidia’s GPUs have been the undisputed kings of AI. If you were training massive language models or running AI applications, you bought Nvidia chips. End of story. But this deal is basically Nvidia admitting something everyone’s been quietly talking about: for the next phase of AI—running already-trained models at massive scale, what’s called “inference”—maybe GPUs aren’t the only answer.

That admission just made life a whole lot more interesting for a bunch of AI chip startups that have been building alternatives.

The Startups That Just Got a Huge Boost

Companies like Cerebras, D-Matrix, and SambaNova have been working on chips specifically designed for AI inference. They trade some of the flexibility of Nvidia’s GPUs for speed and efficiency when running AI models. And until now, they’ve been fighting an uphill battle against the perception that Nvidia was all you needed.

The Groq deal changed that overnight.

“I’m sure D-Matrix is a pretty happy startup right now,” said Karl Freund, founder of Cambrian-AI Research. “I suspect their next round will be at a much higher valuation.” D-Matrix just raised $275 million at a $2 billion valuation last month. That number’s probably going to look quaint pretty soon.

Cerebras is another one to watch. They make these absurdly large “wafer-scale” chips—literally the size of a dinner plate—designed to run huge AI models on a single piece of silicon. They’ve filed for an IPO, but Freund thinks they might get acquired first. “You don’t want to wait until after the IPO, when it’s more expensive,” he said. “From that perspective, Cerebras is sitting pretty right now.”

Then there’s the Intel-SambaNova situation. Word is that Intel has signed a term sheet to acquire SambaNova, which would be another major validation of the inference chip market.

Why This Actually Matters

Sid Sheth, CEO of D-Matrix, put it pretty well: “When [the Nvidia-Groq deal] happened, we said, ‘Finally, the market recognizes it.’ I think what Nvidia has really done is they said, Okay, this approach is a winning approach.”

Andrew Feldman, CEO of Cerebras, was even more blunt. He posted on X that Nvidia’s dominance used to act like a moat, keeping chip startups from taking market share. But that moat’s gone now. “The inference market is fragmenting, and a new category has emerged where speed isn’t a feature—it’s the entire value proposition. A value prop that can only be achieved by a different chip architecture than the GPU.”

It’s not just chip companies, either. AI inference software platforms like Etched, Fireworks, and Baseten are suddenly looking a lot more attractive as acquisition targets. Fireworks raised $250 million at a $4 billion valuation in October—not bad for a company most people outside the AI world have never heard of.

Not Everyone’s Buying the Hype

Of course, not everyone thinks every startup in this space is going to win big.

Matt Murphy, a partner at Menlo Ventures, pointed out that chip investing is brutal. “A lot of VCs stopped investing in chips 10 or 15 years ago,” he said. “It’s capital-intensive; it takes years to get a product out; and the outcomes are hard to predict.”

He’s got a point. Building chips requires massive amounts of capital and years of development before you have anything to sell. And right now, it’s hard to tell which companies have real technical advantages versus which ones are just riding the wave of enthusiasm.

“It’s hard to tell who’s really got something significant versus the tide is [raising] all boats, which is what seems to be going on,” Murphy said. He thinks consolidation is coming—meaning a lot of these startups are probably going to get acquired rather than going public.

The Truly Disruptive Play

Then there’s Naveen Rao, who thinks even the current crop of inference chip startups isn’t disruptive enough.

Rao—who founded MosaicML and was SVP of AI at Databricks—just left to start Unconventional AI. The company raised an absolutely massive $475 million seed round last month, led by Andreessen Horowitz and Lightspeed Ventures. His argument? Companies like Groq, D-Matrix, and Cerebras are optimizing within the existing digital computing paradigm. They’re better, but they’re not fundamentally different.

“We’ve been building the same fundamental machine for 80 years, a numeric digital machine,” Rao said. His vision is to build hardware that works completely differently—exploiting the physical behavior of silicon itself, with neural networks redesigned to match.

It’s a radical bet. And Rao admits it could take five years or more to pay off. But his reasoning is compelling: he predicts that within a few years, 95% of all compute will be used for AI. If that’s true, maybe we need an entirely different kind of machine.

What Happens Next

The Nvidia-Groq deal has fundamentally shifted the conversation around AI chips. What was once a niche argument—that specialized inference chips could challenge Nvidia’s dominance—is now mainstream thinking.

For the startups in this space, that’s huge. Their valuations are going up, acquisition offers are probably coming in, and the market has validated their approach. For investors who bet early on these companies, it’s looking like a pretty smart move.

But Murphy’s probably right that we’re headed for consolidation. Not every inference chip startup is going to make it. Some will get acquired. Some will IPO. And some will probably run out of money before they can prove their technology works at scale.

The real test isn’t whether these companies can build better chips—many of them already have. It’s whether they can navigate the brutal economics of the semiconductor business, scale production, and convince customers to bet on them instead of just buying more Nvidia GPUs.

Nvidia’s Christmas Eve surprise made that challenge a lot more manageable. But it’s still a challenge.
