Normalizing Flows, Transport Maps, and Invertible Neural Networks

Seminar Recording

In this Data Science Under the Hood webinar, Dr Robert Salomone explores Normalizing Flows, Transport Maps, and Invertible Neural Networks.

About this event

*** This is a hybrid event, with in-person (GP-P512) and online attendance welcome. A Zoom link will be emailed to registrants on the day of the event. ***

In this presentation I will give a lively overview of some topics that have attracted increasing interest in Machine Learning and Statistics in recent years: namely, models for probability distributions that can be posited as some (possibly composed) invertible transformation of a simple “base” random vector. For appropriately chosen transformations, the resulting models offer both tractable computation of their likelihood via the famous Change of Variables Theorem (where a judicious choice of transforms allows scaling to thousands, or even millions, of dimensions) and a great deal of model flexibility. This makes them an invaluable tool for statisticians and computer scientists alike. Recently, interest has shifted to transformations that themselves take the form of neural networks constrained to be invertible, which will be briefly discussed. Several examples will be provided, ranging from the early occurrence of such ideas for simple univariate distributions to more modern approaches in computer science that carefully employ neural networks as components in the construction of flexible transformations, an active area of research in generative modelling. Python implementations of more sophisticated transformations using the Pyro probabilistic programming language will also be briefly demonstrated.
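To give a flavour of the kind of demonstration mentioned above, the sketch below builds a small normalizing flow with Pyro: a two-dimensional standard normal base distribution is pushed through a spline coupling transform and fitted by maximum likelihood. The choice of dimension, transform (spline_coupling) and the placeholder data are illustrative assumptions, not the exact examples from the talk.

    import torch
    import pyro.distributions as dist
    import pyro.distributions.transforms as T

    # Simple "base" random vector: a 2-dimensional standard normal.
    base_dist = dist.Normal(torch.zeros(2), torch.ones(2))

    # An invertible, neural-network-parameterised transform (spline coupling flow).
    transform = T.spline_coupling(input_dim=2, count_bins=16)

    # The flow: push the base distribution through the transform.
    flow_dist = dist.TransformedDistribution(base_dist, [transform])

    # log_prob uses the change-of-variables formula:
    # log p_X(x) = log p_Z(T^{-1}(x)) + log |det J_{T^{-1}}(x)|
    data = torch.randn(1000, 2)  # placeholder data set (assumption)
    optimizer = torch.optim.Adam(transform.parameters(), lr=1e-2)
    for step in range(1000):
        optimizer.zero_grad()
        loss = -flow_dist.log_prob(data).mean()  # negative log-likelihood
        loss.backward()
        optimizer.step()
        flow_dist.clear_cache()

    # Sampling: draw from the base distribution and apply the learned transform.
    samples = flow_dist.sample(torch.Size([500]))

Swapping spline_coupling for an autoregressive or affine coupling transform, or composing several transforms in the list passed to TransformedDistribution, follows the same pattern.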

About the presenter

Robert is a postdoctoral fellow at QUT’s Centre for Data Science, working in the models and algorithms domain. Prior to his time at QUT, Robert was a research fellow at UNSW Sydney and at The University of Queensland, where he also completed his PhD on advanced Monte Carlo methods in 2018. His research interests lie at the intersection of statistics and machine learning, often involving an eclectic mix of mathematical ideas and concepts. His work to date has included efficient methods for Monte Carlo integration in challenging problems, Bayesian inference for big data sets in the time-series setting, kernel methods, Markov chain Monte Carlo, variational inference, and rare-event simulation for problems arising in applied probability and combinatorics.


Details:

Location: GP-P512 and online
Start Date: 30/11/2021
Start Time: 2pm (AEST)
End Date: 30/11/2021
End Time: 3pm (AEST)