
#155 Probabilistic Programming for the Real World, with Andreas Munk
Learning Bayesian Statistics
Support & Resources
→ Support the show on Patreon
→ Bayesian Modeling Course (first 2 lessons free):
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Takeaways:
Q: Why is bridging deep learning and probabilistic programming so important?
A: Deep learning is extraordinarily good at fitting complex functions, but it throws away uncertainty. Probabilistic programming keeps uncertainty explicit throughout. Combining the two – as in inference compilation – lets you get the expressiveness of neural networks while still doing proper Bayesian inference.
Q: What is inference compilation and how does it relate to amortized inference?
A: Amortized inference is the general idea of training a model upfront so you don't have to run expensive inference from scratch every single time. Inference compilation is a specific form of amortized inference where a neural network is trained to propose good posterior samples for a given probabilistic program – essentially learning to do inference rather than computing it fresh each query.
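The distinction above can be made concrete with a toy conjugate model. This is a minimal stdlib-Python sketch (all function names are hypothetical): "from-scratch" inference runs a grid approximation anew for every dataset, while the "amortized" version is a precomputed map from data summaries straight to posterior parameters. Conjugacy makes that map exact here; in inference compilation, a neural network learns an analogous mapping for models with no closed form.

```python
import math

# Toy model: theta ~ Beta(1, 1); each coin flip ~ Bernoulli(theta).

# "From-scratch" inference: a grid approximation that loops over the
# whole dataset every time it is called.
def grid_posterior_mean(flips, n_grid=1000):
    grid = [(i + 0.5) / n_grid for i in range(n_grid)]
    weights = []
    for theta in grid:
        ll = 1.0  # likelihood of the data at this grid point (flat prior)
        for x in flips:
            ll *= theta if x == 1 else (1 - theta)
        weights.append(ll)
    z = sum(weights)
    return sum(t * w for t, w in zip(grid, weights)) / z

# "Amortized" inference: a fixed function from a data summary to
# posterior parameters -- the role an inference network learns to play.
def amortized_posterior_mean(flips):
    heads = sum(flips)
    tails = len(flips) - heads
    # Conjugate update: posterior is Beta(1 + heads, 1 + tails).
    return (1 + heads) / (2 + heads + tails)

flips = [1, 1, 0, 1, 0, 1, 1]
print(grid_posterior_mean(flips))       # expensive, per-dataset computation
print(amortized_posterior_mean(flips))  # same answer from the amortized map
```

Both calls return the same posterior mean; the point is that the amortized version pays the modeling cost once, up front, rather than at every query.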
Q: What is PyProb and what problems does it solve?
A: PyProb is a probabilistic programming library designed specifically to support amortized inference workflows. It lets you write probabilistic models in Python and then train inference networks on top of them, making methods like inference compilation practical for real-world simulators and scientific models.
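PyProb's own API is not reproduced here; as a stand-in, the core pattern it makes practical — a learned proposal corrected by importance weighting — can be sketched in plain Python. In this toy model the true posterior is known (mu ~ N(0, 1), x ~ N(mu, 1) gives posterior N(x/2, 1/2)), and `proposal_params` is a hypothetical, deliberately imperfect placeholder for a trained inference network.

```python
import math
import random

random.seed(0)

def normal_logpdf(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

# Generative model: mu ~ N(0, 1), x ~ N(mu, 1).
def log_joint(mu, x_obs):
    return normal_logpdf(mu, 0.0, 1.0) + normal_logpdf(x_obs, mu, 1.0)

# Stand-in for a trained proposal network: maps the observation to
# (slightly wrong) proposal parameters. Hypothetical -- in inference
# compilation a neural network is trained to produce these.
def proposal_params(x_obs):
    return 0.4 * x_obs, 0.9  # proposal mean, std

def posterior_mean_is(x_obs, n=20000):
    q_mu, q_sigma = proposal_params(x_obs)
    total_w, total_wx = 0.0, 0.0
    for _ in range(n):
        mu = random.gauss(q_mu, q_sigma)  # propose from the "network"
        # Importance weight corrects for the proposal's imperfection.
        logw = log_joint(mu, x_obs) - normal_logpdf(mu, q_mu, q_sigma)
        w = math.exp(logw)
        total_w += w
        total_wx += w * mu
    return total_wx / total_w

x_obs = 2.0
print(posterior_mean_is(x_obs))  # should be close to x_obs / 2 = 1.0
```

The importance weights are what keep the result properly Bayesian even when the proposal is off; a better-trained proposal just means lower-variance estimates from fewer samples.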
Q: What are probabilistic surrogate networks and why do they matter?
A: A probabilistic surrogate network is a learned approximation of a complex, expensive simulator that preserves uncertainty. Instead of running a costly simulation thousands of times, you train a surrogate that can answer probabilistic queries much faster – crucial for applications like risk modeling where speed and uncertainty quantification both matter.
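A minimal sketch of that idea, with a lookup table standing in for the surrogate network (the table, the bin scheme, and all names are hypothetical simplifications): an "expensive" stochastic simulator is run many times up front to fit a per-bin Gaussian over its output, after which probabilistic queries like P(y > t | x) cost no simulator calls at all.

```python
import math
import random
import statistics

random.seed(1)

# "Expensive" stochastic simulator we want to avoid calling at query time.
def simulator(x):
    return math.sin(x) + random.gauss(0.0, 0.2)

# Fit a toy probabilistic surrogate: a per-bin Gaussian over the output.
# A probabilistic surrogate network would learn this mapping with a
# neural net; the bin table is a stand-in for illustration.
def fit_surrogate(n_bins=20, runs_per_bin=200, x_max=math.pi):
    table = []
    for b in range(n_bins):
        x = (b + 0.5) * x_max / n_bins
        ys = [simulator(x) for _ in range(runs_per_bin)]
        table.append((statistics.fmean(ys), statistics.stdev(ys)))
    return table

def surrogate_prob_above(table, x, threshold, x_max=math.pi):
    b = min(int(x / x_max * len(table)), len(table) - 1)
    mu, sigma = table[b]
    # Gaussian tail probability P(y > threshold | x) -- uncertainty
    # preserved, zero simulator calls.
    z = (threshold - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

table = fit_surrogate()
print(surrogate_prob_above(table, 1.2, 0.9))  # fast uncertainty-aware query
```

This is the trade at the heart of surrogates: one up-front batch of simulator runs buys arbitrarily many cheap downstream queries, which is exactly what risk-style applications need.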
Chapters:
00:00:00 Introduction to Bayesian Inference and Its Barriers
00:03:51 Andreas Munk's Journey into Statistics
00:10:09 Bridging the Gap: Bayesian Inference in Real-World Applications
00:15:56 Deep Learning Meets Probabilistic Programming
00:22:05 Understanding Inference Compilation and Amortized Inference
00:28:14 Exploring PyProb: A Tool for Amortized Inference
00:33:55 Probabilistic Surrogate Networks and Their Applications
00:38:10 Building Surrogate Models for Probabilistic Programming
00:45:44 The Challenge of Bayesian Inference in Enterprises
00:52:57 Communicating Uncertainty to Stakeholders
01:01:09 Democratizing Bayesian Inference with Evara
01:06:27 Insurance Pricing and Latent Variables
01:16:41 Modeling Uncertainty in Predictions
01:20:29 Dynamic Inference and Decision-Making
01:23:17 Updating Models with Actual Data
01:26:11 The Future of Bayesian Sampling in Excel
01:31:54 Navigating Business Challenges and Growth
01:36:40 Exploring Language Models and Their Applications
01:38:35 The Quest for Better Inference Algorithms
01:41:01 Dinner with Great Minds: A Thought Experiment
Thank you to my Patrons for making this episode possible!