Nov 22 – 25, 2021
Kloster Irsee, https://www.kloster-irsee.de/home
Europe/Berlin timezone

Bayesian Reasoning with Trained Neural Networks

Nov 24, 2021, 9:55 AM
25m
Kloster Irsee, https://www.kloster-irsee.de/home

Speaker

Jakob Knollmüller (ORIGINS Data Science Lab)

Description

In this talk we present how to combine independently trained neural networks to jointly solve novel tasks through Bayesian reasoning. Deep generative networks serve as prior distributions over complex systems, and regression or classification networks are used to check whether certain features are present. Bayes' theorem then allows us to solve the inverse problem in terms of the latent variables of the generator and to obtain the distribution of systems compatible with one or several posed constraints. We demonstrate how elaborate tasks can be formulated by imposing multiple constraints simultaneously. Since Bayesian inference extends logic to the domain of uncertainty, such questions are answered in a reasoned way. We show that this approach is compatible with state-of-the-art machine learning architectures with millions of trained weights and hundreds of latent parameters.
While traditional machine learning approaches might be better at one specific task, we do not have to train a dedicated network for every question. Instead, we flexibly compose appropriate networks from a library of building blocks and solve the associated Bayesian inference problem. Each of these building blocks is simple, serves a single purpose, and can be reused.
The scope of questions we can approach in this fashion grows exponentially with the number of available building blocks. This potentially provides a path to reasoning systems that can flexibly answer complex questions as they emerge.
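The sketch below is a rough, hedged illustration of the setup described above, not the speaker's implementation: a fixed generative network acts as the prior over a system, two fixed classifiers impose constraints as Bernoulli likelihood terms, and the latent variables are inferred. PyTorch is assumed, the networks are toy stand-ins for independently pre-trained building blocks, the names (generator, has_feature_a, has_feature_b) are hypothetical, and a MAP point estimate replaces the full Bayesian inference over the latents for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

LATENT_DIM, DATA_DIM = 8, 28 * 28

# Stand-ins for independently pre-trained building blocks:
# a deep generative network (latent -> system) and two feature classifiers.
generator = nn.Sequential(nn.Linear(LATENT_DIM, 64), nn.ReLU(), nn.Linear(64, DATA_DIM))
has_feature_a = nn.Sequential(nn.Linear(DATA_DIM, 64), nn.ReLU(), nn.Linear(64, 1))
has_feature_b = nn.Sequential(nn.Linear(DATA_DIM, 64), nn.ReLU(), nn.Linear(64, 1))

for net in (generator, has_feature_a, has_feature_b):
    for p in net.parameters():
        p.requires_grad_(False)  # trained weights stay fixed; only the latents are inferred


def log_posterior(z):
    """log p(z | both features present), up to a constant:
    standard-normal latent prior plus one Bernoulli likelihood term per imposed constraint."""
    x = generator(z)
    log_prior = -0.5 * (z ** 2).sum()
    constraint_a = F.logsigmoid(has_feature_a(x)).sum()  # constraint: "feature A is present"
    constraint_b = F.logsigmoid(has_feature_b(x)).sum()  # constraint: "feature B is present"
    return log_prior + constraint_a + constraint_b


# MAP estimate of the latent variables by gradient ascent; the talk performs full
# Bayesian inference over z, which this sketch replaces with a point estimate.
z = torch.zeros(LATENT_DIM, requires_grad=True)
optimizer = torch.optim.Adam([z], lr=0.05)
for _ in range(200):
    optimizer.zero_grad()
    loss = -log_posterior(z)
    loss.backward()
    optimizer.step()

x_compatible = generator(z)  # a system compatible with both constraints simultaneously
print("negative log posterior after optimization:", float(loss))
```

Adding a further constraint amounts to adding another log-likelihood term to log_posterior, which is how composing building blocks from a library would look in this toy setting.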

Primary author

Jakob Knollmüller (ORIGINS Data Science Lab)

Presentation materials