Anand Chowdhary

Groq becomes strategic AI infrastructure

Nvidia basically just told the market Groq is worth ~$20B, three months after a $6.9B round. This is the market saying, "oh, ultra-low-latency inference isn't a nice-to-have, it's systemically strategic infrastructure now." Groq's IP is being priced less like a startup and more like a critical layer in the stack. Think "this must exist for the future of AI to work at scale" rather than "cool dev tool with ARR."

What happens next? Every remotely credible alternative to Nvidia in the inference path will start anchoring off this scarcity premium: custom accelerators, new compilers, weirdly specific LPU-like chips that only five people understand, even software-defined plays that squeeze more out of existing hardware. All of them now have a new reference point.

So when you think about your own company's value, you might want to ask: are you "just" SaaS, or are you sitting on something the ecosystem will treat as strategic infrastructure in a world where GPUs are the new oil? Because the market is starting to price that difference in, aggressively.