The Race to Unseat NVIDIA Is On

And an AI that does your doomscrolling for you

In partnership with

Gladly Connect Live '26. May 4–6 in Atlanta.

AI has everyone talking. Not everyone has answers. At Gladly Connect Live, CX leaders from Condé Nast, Smith Optics, and more share exactly how they moved AI from pilot to production: the timelines, the systems, the QA loops. 13+ sessions built for the moment we're all in. For CX and ecommerce leaders. Atlanta, May 4–6. Space is limited; secure your spot now.

The Race to Unseat NVIDIA Is On

Google used Cloud Next '26 to unveil TPU 8t and TPU 8i, its eighth-generation AI chips split between training and inference. The company is framing the move as part of a broader push around the "agentic enterprise" -- not just faster silicon, but a fuller cloud stack for model builders and corporate buyers. (blog.google)

That framing is what makes this more interesting than a normal product launch. Google is not just adding another cloud feature -- it is trying to chip away at Nvidia's hold on AI infrastructure economics.

In short, Google Cloud is launching two new AI chips to compete with Nvidia. The real question is whether customers actually move workloads, or whether this becomes another impressive demo that mostly reassures investors that Google is still on the cutting edge.

Tom’s Take

As the old adage goes, during a gold rush it’s not the miners who get rich…it’s the shovel sellers.

NVIDIA is the quintessential “shovel seller” of our current AI gold rush. The company sells the chips on which nearly every AI company depends. And that has made it fabulously valuable.

Unsurprisingly, other companies want a piece of that AI pie. Making cutting-edge chips is hard, though: a single piece of chip-making equipment can cost hundreds of millions of dollars.

Who has the resources to compete? Other big tech companies. That’s why we’re seeing Google enter the race. Others will surely follow.

Other AI News

  • DeepSeek previewed V4 Flash and V4 Pro, saying both models support 1 million-token context windows and come in at aggressive price points. The pitch is simple: close the gap on frontier performance while keeping costs low.

  • Thinking Machines Lab keeps pulling in Meta talent, and it is also benefiting from a multibillion-dollar Google cloud deal that gives it access to Nvidia GB300-based systems. That is the AI race in one sentence: people, compute, and who can pay for both. (techcrunch.com)

  • OpenAI published a child safety blueprint. It is a reminder that the safety work around frontier models is getting more concrete, and more public. (openai.com)

Today's Strangest AI

Noscroll is an AI bot that doomscrolls for you, then texts you when it finds something worth your attention. We have now automated one of the least admirable human habits, which is either efficient or deeply on brand.