
Content provided by Alexandre Andorra. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Alexandre Andorra or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
#107 Amortized Bayesian Inference with Deep Neural Networks, with Marvin Schmitt

1:21:37
 
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!


In this episode, Marvin Schmitt introduces the concept of amortized Bayesian inference, where the upfront training phase of a neural network is followed by fast posterior inference.

Marvin guides us through this new concept, discussing his work in probabilistic machine learning and uncertainty quantification using Bayesian inference with deep neural networks.

He also introduces BayesFlow, a Python library for amortized Bayesian workflows, and discusses its use cases in various fields, while also touching on the concept of deep fusion and its relation to multimodal simulation-based inference.

A PhD student in computer science at the University of Stuttgart, Marvin is supervised by two LBS guests you surely know — Paul Bürkner and Aki Vehtari. Marvin’s research combines deep learning and statistics to make Bayesian inference fast and trustworthy.

In his free time, Marvin enjoys board games and is a passionate guitar player.

Our theme music is « Good Bayesian » by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary and Blake Walters.

Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

Takeaways:

  • Amortized Bayesian inference combines deep learning and statistics to make posterior inference fast and trustworthy.
  • Bayesian neural networks can be used for full Bayesian inference on neural network weights.
  • Amortized Bayesian inference decouples the training phase and the posterior inference phase, making posterior sampling much faster.
  • BayesFlow is a Python library for amortized Bayesian workflows, providing a user-friendly interface and modular architecture.
  • Self-consistency loss combines simulation-based inference and likelihood-based Bayesian inference, with a focus on amortization.
  • The BayesFlow package aims to make amortized Bayesian inference more accessible and provides sensible default values for neural networks.
  • Deep fusion techniques allow for the fusion of multiple sources of information in neural networks.
  • Generative models that are expressive and have one-step inference are an emerging topic in deep learning and probabilistic machine learning.
  • Foundation models, which have a large training set and can handle out-of-distribution cases, are another intriguing area of research.
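
The decoupling described in the takeaways above can be sketched with a toy example. Note this is not BayesFlow's API — it is a plain NumPy illustration in which linear least squares stands in for the neural network, on a conjugate Gaussian model where the exact posterior mean is known, so the amortized answer can be checked:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Upfront phase (expensive, done once): simulate, then train ---
# Toy conjugate model: theta ~ N(0, 1), x_i | theta ~ N(theta, 1), i = 1..20.
# The analytic posterior mean is n * xbar / (n + 1), which we use as a check.
n_sims, n_obs = 5000, 20
theta = rng.normal(0.0, 1.0, size=n_sims)
x = rng.normal(theta[:, None], 1.0, size=(n_sims, n_obs))
xbar = x.mean(axis=1)  # a sufficient summary statistic of each dataset

# Linear least squares stands in for the neural network: learn a map
# from data summaries to parameters across all simulations at once.
A = np.column_stack([xbar, np.ones(n_sims)])
w, *_ = np.linalg.lstsq(A, theta, rcond=None)

# --- Amortized phase (cheap, repeated): instant estimate for new data ---
x_new = rng.normal(1.5, 1.0, size=n_obs)
estimate = w[0] * x_new.mean() + w[1]
analytic = n_obs * x_new.mean() / (n_obs + 1)
print(f"amortized: {estimate:.2f}  analytic: {analytic:.2f}")
```

Once the map is fitted, inference for each new dataset is a single cheap evaluation: that is the amortization. The simulation and training costs are paid once up front, and every subsequent dataset is almost free — real amortized methods replace the linear map with a conditional generative network that returns full posterior draws, not just a point estimate.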

Chapters:

00:00 Introduction to Amortized Bayesian Inference

07:39 Bayesian Neural Networks

11:47 Amortized Bayesian Inference and Posterior Inference

23:20 BayesFlow: A Python Library for Amortized Bayesian Workflows

38:15 Self-Consistency Loss: Bridging Simulation-Based Inference and Likelihood-Based Bayesian Inference

41:35 Amortized Bayesian Inference

43:53 Fusing Multiple Sources of Information

45:19 Compensating for Missing Data

56:17 Emerging Topics: Expressive Generative Models and Foundation Models

01:06:18 The Future of Deep Learning and Probabilistic Machine Learning

Links from the show:


Transcript

This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
