Pushing compute to the limits of physics
🎯 Summary
This 83-minute episode features a highly technical discussion between host Maxwell Ramstead and guest Guillaume Verdon (known online as Beff Jezos), founder of Extropic, a company pioneering thermodynamic computing. The conversation traces Verdon's intellectual journey from theoretical physics and quantum gravity to the development of novel, physics-based hardware accelerators for probabilistic inference.
1. Focus Area
The primary focus is on next-generation computing paradigms, specifically contrasting Quantum Computing and Thermodynamic Computing as alternatives to classical digital computation, particularly for complex probabilistic tasks in AI/ML (like Markov Chain Monte Carlo - MCMC). The discussion heavily integrates concepts from theoretical physics, quantum information theory, and complex systems.
2. Key Technical Insights
- The Failure of Reductionism in Physics: Verdon argues that the traditional reductionist approach in physics (seeking simple, few-parameter models) is failing for complex systems. The future lies in a "complexist" approach, viewing the universe as a massive, self-simulating quantum computer, which calls for computational tools that mirror this complexity (e.g., tensor networks, deep learning).
- Quantum Computing’s Energy Bottleneck: While powerful, quantum computing requires extreme energy expenditure to maintain near-zero entropy (quantum coherence) against environmental noise, necessitating massive overhead for error correction.
- Thermodynamic Computing as a “Hotter” Alternative: Extropic’s approach harnesses the exotic stochastic physics of electrons to build computers that operate closer to thermal equilibrium. This allows them to naturally execute probabilistic algorithms like MCMC much more energy-efficiently than digital emulation, essentially “letting go of the tight grip” on determinism.
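To make the MCMC workload concrete, here is a minimal, illustrative Metropolis-Hastings sampler in Python. This is not Extropic's method; the Gaussian target and step size are arbitrary choices for the sketch. The point is that every proposal and accept/reject decision below consumes pseudo-random draws emulated on deterministic digital hardware, which is precisely the overhead a machine sampling directly from physical stochastic dynamics would avoid.

```python
import math
import random

def metropolis_hastings(log_prob, x0, n_steps, step_size=0.5, seed=0):
    """Sample from an unnormalized density via Metropolis-Hastings.

    Each step proposes a Gaussian perturbation and accepts it with
    probability min(1, p(x') / p(x)). On digital hardware, every one
    of these random draws must be emulated with a PRNG.
    """
    rng = random.Random(seed)
    x, lp = x0, log_prob(x0)
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.gauss(0.0, step_size)   # propose
        lp_new = log_prob(x_new)
        if math.log(rng.random()) < lp_new - lp:  # accept/reject
            x, lp = x_new, lp_new
        samples.append(x)
    return samples

# Target: standard normal, log p(x) = -x^2 / 2 (up to a constant).
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
burned = samples[5000:]  # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
print(round(mean, 2), round(var, 2))  # should land near 0 and 1
```

The chain converges to the target distribution, but only by burning through thousands of sequential PRNG calls and floating-point operations per effective sample, which is the inefficiency the episode's hardware argument targets.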
3. Business/Investment Angle
- Inefficiency of Current AI Stacks: There is a fundamental mismatch: modern probabilistic AI software (like Transformers with softmax sampling, or Monte Carlo Tree Search) runs on hardware optimized for determinism, leading to massive energy inefficiency when stochasticity is reintroduced in software.
- Market for Specialized Accelerators: Thermodynamic computing targets the intractable emulation of MCMC algorithms, offering orders of magnitude speed-up and energy efficiency gains for probabilistic inference tasks, creating a niche market for physics-based accelerators.
- The “Letting Go” Mindset: Similar to how deep learning required researchers to let gradient descent take control over explicit programming (Software 1.0), hardware development must embrace noise and stochasticity rather than fighting it to unlock new computational regimes.
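The software/hardware mismatch above can be sketched in a few lines of Python. The logits below are made-up values standing in for a model's output; the key observation is that the deterministic arithmetic produces only a probability vector, and the actual randomness of token selection is bolted on afterwards with a software PRNG.

```python
import math
import random

def softmax(logits):
    # Subtract the max before exponentiating, for numerical stability.
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(logits, rng):
    """Draw one index from the softmax distribution over logits.

    The arithmetic producing the probabilities is deterministic;
    randomness is re-injected in software via inverse-CDF sampling.
    """
    probs = softmax(logits)
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1  # guard against floating-point round-off

rng = random.Random(42)
logits = [2.0, 1.0, 0.1]  # hypothetical model outputs
counts = [0, 0, 0]
for _ in range(10000):
    counts[sample_token(logits, rng)] += 1
print(counts)  # roughly proportional to softmax([2.0, 1.0, 0.1])
```

A physics-native stochastic device would produce the sample itself, rather than first computing an exact distribution and then spending additional deterministic work to draw from it.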
4. Notable Companies/People
- Guillaume Verdon (Beff Jezos): Founder of Extropic; former researcher at Alphabet working on quantum computing and TensorFlow Quantum; proponent of effective accelerationism (e/acc).
- Extropic: Company pioneering thermodynamic computing, focusing on stochastic hardware for probabilistic inference.
- Google DeepMind (Sponsor): Mentioned Gemini 2.5 Flash as an example of balancing model intelligence, speed, and cost in current AI development.
- Theoretical Physics Icons: Richard Feynman and Stephen Hawking were cited as early influences on Verdon's pursuit of a "theory of everything."
5. Future Implications
The conversation suggests a future where computation moves away from absolute determinism. Instead of fighting entropy (as in classical and quantum computing), future accelerators will harness natural physical processes (stochastic dynamics) to solve problems that are inherently probabilistic. This shift implies a diversification of hardware substrates beyond silicon CMOS and superconducting qubits, moving toward specialized, physics-native accelerators for specific computational classes.
6. Target Audience
This episode is highly valuable for AI/ML researchers, hardware architects, quantum computing specialists, and technology investors interested in the fundamental limits of computation, novel accelerator design, and the intersection of physics and advanced AI.
🏢 Companies Mentioned
- Extropic
- Alphabet (Google)
- Google DeepMind
đź’¬ Key Insights
"If we're optimistic about the future, we tend to steer things towards that optimistic outcome."
"And if we look at very negative outcomes and we're obsessed with them, we will drive whatever system we're thinking about towards those negative outcomes."
"In active inference, you have perception and action. Based on your sensory input, you can update your model of the world, so you're minimizing the divergence of your own internal model from the statistics of the world."
"But then the dual of that is taking actions in the world to minimize the divergence between the world and your predictive model of it."
"We don't have as much control and interpretability; we don't understand the world as well. We kind of have to, pardon my French, fuck around and find out: the FAFO algorithm."
"It's basically an infinite time horizon. Blowing up the planet does burn up a bunch of free energy, but in the long term, for the same reason life exists, it's much better to conserve and strategically use free energy to secure more free energy and keep growing, rather than just burn it all and have chaos."
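The "minimizing divergence" idea in the active-inference quotes above is usually formalized as a Kullback-Leibler divergence between the world's statistics and the agent's internal model. A minimal sketch, with made-up distributions chosen purely for illustration:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i).

    Zero exactly when the model q matches the world p; perception in
    active inference is framed as driving this quantity down.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

world = [0.7, 0.2, 0.1]      # "statistics of the world" (hypothetical)
model = [0.5, 0.3, 0.2]      # the agent's internal model
updated = [0.68, 0.21, 0.11] # the model after updating on sensory input

print(kl_divergence(world, model))    # larger divergence
print(kl_divergence(world, updated))  # smaller: the model improved
```

Perception updates `model` toward `world` (as above); the dual move, action, changes the world's statistics to better match the model's predictions, and both shrink the same divergence.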