Arthur Gretton will describe Generalized Energy-Based Models (GEBMs) for generative modeling. These models combine two trained components: a base distribution (generally an implicit model, as in a Generative Adversarial Network), which can learn the support of data with low intrinsic dimension in a high-dimensional space; and an energy function, which refines the probability mass on the learned support. The energy function and base jointly constitute the final model, unlike GANs, which retain only the base distribution (the "generator"). Furthermore, unlike classical energy-based models, the GEBM energy is defined even when the supports of the model and data do not overlap. Samples from the trained model can be obtained via Langevin-diffusion-based methods (MALA, ULA, HMC). Empirically, GEBM samples on image-generation tasks are of better quality than those from the learned generator alone, indicating that, all else being equal, a GEBM will outperform a GAN of the same complexity.
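As a rough illustration of the Langevin-based samplers mentioned above, the sketch below implements the Unadjusted Langevin Algorithm (ULA), which draws approximate samples from a density p(z) ∝ exp(−E(z)) using only the gradient of the energy E. The quadratic toy energy and step-size values here are illustrative assumptions, not the actual GEBM sampler (which operates on the generator's latent space with a learned energy).

```python
import numpy as np

def ula_sample(grad_energy, z0, step=1e-2, n_steps=500, rng=None):
    """ULA: z <- z - step * grad E(z) + sqrt(2 * step) * noise.

    Approximately samples from p(z) proportional to exp(-E(z)).
    `grad_energy` is a function returning the gradient of E at z.
    """
    rng = np.random.default_rng() if rng is None else rng
    z = np.asarray(z0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(z.shape)
        z = z - step * grad_energy(z) + np.sqrt(2.0 * step) * noise
    return z

# Toy check: with E(z) = ||z||^2 / 2 the target is a standard 2-D Gaussian,
# so the empirical mean should be near 0 and the variance near 1.
samples = np.stack([
    ula_sample(lambda z: z, z0=np.zeros(2), rng=np.random.default_rng(i))
    for i in range(2000)
])
print(samples.mean(axis=0), samples.var(axis=0))
```

ULA omits the Metropolis accept/reject correction that MALA adds, so it trades a small step-size-dependent bias for simplicity; in a GEBM the hand-written gradient would be replaced by automatic differentiation of the learned energy.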
Arthur Gretton is a Professor with the Gatsby Computational Neuroscience Unit, and director of the Centre for Computational Statistics and Machine Learning (CSML) at UCL. He received degrees in Physics and Systems Engineering from the Australian National University, and a PhD with Microsoft Research and the Signal Processing and Communications Laboratory at the University of Cambridge. He previously worked at the MPI for Biological Cybernetics, and at the Machine Learning Department, Carnegie Mellon University.
Arthur's recent research interests in machine learning include the design and training of generative models, both implicit (e.g. GANs) and explicit (exponential family and energy-based models), causal modeling, and nonparametric hypothesis testing.
He was an associate editor at IEEE Transactions on Pattern Analysis and Machine Intelligence from 2009 to 2013, has been an Action Editor for JMLR since April 2013, served as a Senior Area Chair for NeurIPS in 2018 and 2021, and has been a member of the Royal Statistical Society Research Section Committee since January 2020. Arthur was a program chair for AISTATS in 2016, a tutorials chair for ICML 2018, a workshops chair for ICML 2019, a program chair for the DALI workshop in 2019, and an organiser of the Machine Learning Summer School 2019 in London.