20VC: OpenAI and Anthropic Will Build Their Own Chips | NVIDIA Will Be Worth $10TRN | How to Solve the Energy Required for AI... Nuclear | Why China is Behind the US in the Race for AGI with Jonathan Ross, Groq Founder

The Twenty Minute VC October 03, 2025 81 min
artificial-intelligence ai-infrastructure startup investment generative-ai nvidia anthropic google
85 Companies
187 Key Quotes
5 Topics
3 Insights

🎯 Summary

Comprehensive Podcast Summary: Jonathan Ross on AI’s Future Landscape

Focus Area

This episode covers the AI infrastructure landscape, focusing on chip development, market dynamics, energy requirements, and geopolitical implications. Key technologies discussed include TPUs, HBM (High Bandwidth Memory), inference compute, and nuclear energy solutions for AI workloads.

Key Technical Insights

• Supply Chain Bottlenecks: NVIDIA's monopsony over HBM creates artificial scarcity — the company could produce roughly 50M GPU dies annually but ships only about 5.5M GPUs due to HBM constraints
• Speed's Critical Value: Every 100ms of latency improvement drives an ~8% increase in conversion rates; speed shapes brand affinity and user engagement more than raw capability
• Chip Amortization Reality: While the industry uses 3-5 year depreciation cycles, Ross advocates 1-year upgrade cycles because rapid performance improvements make older chips economically unviable
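The latency claim above can be made concrete with a small sketch. The ~8%-per-100ms figure comes from the episode; the compounding model and the example numbers below are illustrative assumptions, not something Ross specifies.

```python
def conversion_after_speedup(base_rate: float, latency_saved_ms: float,
                             lift_per_100ms: float = 0.08) -> float:
    """Apply the ~8% relative conversion lift per 100ms of latency saved,
    compounded over the total saving (an assumed model for illustration)."""
    steps = latency_saved_ms / 100.0
    return base_rate * (1 + lift_per_100ms) ** steps

# Hypothetical example: a 5% baseline conversion rate with 300ms shaved off.
print(round(conversion_after_speedup(0.05, 300), 4))  # → 0.063
```

Under this model, three 100ms improvements compound to roughly a 26% relative lift, which is why Ross treats latency optimization as economically crucial rather than a polish item.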

Business/Investment Angle

• Insatiable Compute Demand: OpenAI and Anthropic could nearly double revenue within a month if given 2x their current inference compute capacity, because existing rate limits are throttling demand
• NVIDIA's $10T Trajectory: Ross predicts NVIDIA could reach a $10 trillion valuation within 5 years based on sustained compute demand and market positioning
• Vertical Integration Imperative: Major AI companies will build custom chips not for cost savings, but for supply-chain control and strategic independence from NVIDIA's allocation decisions

Notable Companies/People

Key Players: Jonathan Ross (Groq founder, former Google TPU architect), OpenAI, Anthropic, NVIDIA, Google, Microsoft, Amazon
Emerging Dynamics: Tesla's Dojo cancellation illustrates chip-development risk; only 35-36 companies generate 99% of AI token revenue, an extreme market concentration

Future Implications

The industry is moving toward a bifurcated model where compute access determines AI leadership. Countries and companies controlling energy and compute infrastructure will dominate AGI development. The current “oil drilling” phase with many failures and few successes will evolve into predictable science, reducing investor returns but increasing deployment reliability. Nuclear energy emerges as the critical enabler for scaling AI compute requirements.

Target Audience

Primary: AI/ML executives, chip industry professionals, and venture capitalists focused on infrastructure
Secondary: Technology strategists and policymakers concerned with AI competitiveness


Comprehensive Analysis

This 81-minute conversation with Groq founder Jonathan Ross provides a masterclass in AI infrastructure economics and strategic positioning. Ross, who previously architected Google’s TPU, offers unique insights into the complex dynamics shaping the AI chip landscape.

The Central Thesis: We’re witnessing an unprecedented economic phenomenon where adding compute capacity directly translates to increased economic output through AI capabilities. This has never occurred in economic history, creating massive investment opportunities alongside significant risks.

Market Dynamics and Bubble Analysis: Rather than asking whether AI represents a bubble, Ross reframes the question around “smart money” behavior. When Microsoft deploys GPUs but refuses to rent them via Azure because internal usage generates higher returns, it signals genuine value creation. The market resembles early oil drilling - highly lumpy with many failures but spectacular successes for those with good instincts.

The NVIDIA Ecosystem: Ross reveals NVIDIA’s sophisticated market control through HBM monopsony power. While NVIDIA could theoretically produce 10x more GPU dies, memory constraints limit actual production. This creates artificial scarcity that NVIDIA leverages for pricing power and customer allocation control. The company’s $100B investment in OpenAI, which largely returns as chip purchases, isn’t an “infinite loop” since 40% flows to suppliers, creating real economic activity.

Technical Performance Imperatives: Speed emerges as the critical differentiator, not just computational capability. Ross draws parallels to consumer goods, where faster-acting products (tobacco, soft drinks) command premium margins due to dopamine response cycles. In AI applications, this translates to user engagement and conversion rates, making latency optimization economically crucial.

Strategic Vertical Integration: Major AI companies will inevitably develop custom chips, not primarily for cost advantages but for supply chain independence. Ross’s Google experience illustrates this - they built 10,000 AMD servers knowing Intel would win, purely to negotiate better Intel pricing. Custom chip development provides allocation control and strategic flexibility, even if the resulting chips prove inferior to NVIDIA’s offerings.

Energy and Geopolitical Implications: The conversation touches on nuclear energy as the solution for AI’s massive power requirements, with the prescient observation that “countries controlling compute will control AI, and you cannot have compute without energy.” This positions energy infrastructure as a national security imperative.

Investment and Amortization Realities: While industry standard depreciation runs 3-5 years, Ross advocates much shorter cycles due to rapid performance improvements. The key insight involves two-phase chip economics: initial deployment must cover capex, but continued operation only needs to beat opex. This creates situations where older chips remain profitable despite being technologically obsolete.
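The two-phase economics described above reduce to two different breakeven tests, sketched below with hypothetical per-chip numbers (the figures are illustrative assumptions, not from the episode).

```python
def should_deploy(monthly_revenue: float, capex: float,
                  amort_months: int, monthly_opex: float) -> bool:
    """Phase 1: a new purchase must cover amortized capex plus opex."""
    return monthly_revenue > capex / amort_months + monthly_opex

def should_keep_running(monthly_revenue: float, monthly_opex: float) -> bool:
    """Phase 2: an already-deployed chip's capex is sunk, so it stays on
    as long as revenue beats operating cost alone."""
    return monthly_revenue > monthly_opex

# An aging chip: revenue no longer justifies buying one new (on a short,
# Ross-style amortization window), yet it remains profitable to operate.
rev, capex, opex = 900.0, 30_000.0, 400.0  # per chip, per month (hypothetical)
print(should_deploy(rev, capex, amort_months=12, monthly_opex=opex))  # False
print(should_keep_running(rev, opex))                                 # True
```

This asymmetry is exactly why obsolete chips keep running in fleets: they fail the purchase test but pass the operating test.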

Market Concentration and Future Evolution: With only 35-36 companies generating 99% of AI revenue, the market shows extreme concentration typical of emerging technologies. Ross predicts evolution from current “vibe investing” toward scientific predictability, reducing investor returns but increasing deployment success rates.

This conversation matters because it provides an insider's perspective on the infrastructure layer enabling the AI revolution.

🏢 Companies Mentioned

Cerebras ai_infrastructure
Databricks ai_infrastructure
Rapid ai_application
Lovable ai_application
Intel ai_infrastructure
AMD ai_infrastructure
Facebook big_tech
Vanta ai_application
Robinhood fintech
Coinbase fintech
Brex fintech
Grammarly ai_application
Coda ai_application
Goldman Sachs financial
McLaren other

💬 Key Insights

"I think over time, we're going to realize that LLMs are the telescope of the mind that right now they're making us feel really, really small. But in a hundred years, we're going to realize that intelligence is more vast than we could have ever imagined. And we're going to think that's beautiful."
Impact Score: 9
"That NVIDIA's software is a moat, CUDA lock-in — it's bullshit. Yeah. It's true for training, but it's not true for inference."
Impact Score: 9
"We don't have the same supply chain constraints. We can build more compute than anyone else in the world. The most finite resource right now, compute. The thing that people are bidding up and paying these high margins for, we can produce nearly unlimited quantities of."
Impact Score: 9
"Would you rather invest in OpenAI at 500 billion or Anthropic at 180? I'd want to invest in both. Would you? Yeah. They're both undervalued, highly undervalued."
Impact Score: 9
"We do tell them you must use AI. Because otherwise, you're just not going to be competitive. But we saw them using Sourcegraph. We saw them then using Anthropic. We saw them then using Codex. Next month, it'll probably be Sourcegraph again. It just keeps going around and around in a circle."
Impact Score: 9
"The most valuable thing in the economy is labor. And now we're gonna be able to add more labor to the economy by producing more compute and better AI. That has never happened in the history of the economy before."
Impact Score: 9

📊 Topics

#artificialintelligence 222 #aiinfrastructure 44 #startup 23 #investment 5 #generativeai 3

🧠 Key Takeaways

💡 Open models allow China to distill effectively, as it has already done; the model itself is not a clear advantage
💡 Groq prices compute so that its supply meets its demand


Generated: October 03, 2025 at 12:08 PM