Title: Source Coding with Latent Variable Models
Abstract: Model-based entropy coders (e.g. arithmetic coding) require an estimate of the source probability mass function (PMF) to compress data. Better estimates of the PMF lead to better compression rates. Latent variable models (LVMs) have proven to be excellent PMF estimators, especially for high-dimensional and complex sources such as images. Unfortunately, LVMs only provide an estimate of a joint PMF over the source and latent variables. This makes source coding with them non-trivial, as marginalizing over the latent variables is intractable in many cases. In this talk, we will learn how to design lossless coding schemes that use LVMs without marginalizing out the latent variables, through the use of bits-back coding.
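As a rough illustration of the idea (not taken from the talk), the net codelength achieved by bits-back coding can be checked numerically on a toy discrete LVM. All distributions below are hypothetical; the sketch assumes an encoder that decodes a latent z from spare bits using q(z|x), then encodes x with p(x|z) and z with p(z), so the expected net rate per symbol is E_q[-log2 p(x,z) + log2 q(z|x)].

```python
import math

# Hypothetical toy LVM: latent z in {0,1}, observation x in {0,1,2}.
p_z = [0.5, 0.5]                      # prior p(z)
p_x_given_z = [[0.7, 0.2, 0.1],       # p(x | z=0)
               [0.1, 0.3, 0.6]]       # p(x | z=1)

def p_joint(x, z):
    return p_z[z] * p_x_given_z[z][x]

def p_marginal(x):
    # Tractable here only because z takes two values.
    return sum(p_joint(x, z) for z in range(2))

def q_posterior(z, x):
    # Exact posterior for this toy model; in practice an
    # approximate q(z|x) (e.g. from a VAE encoder) is used.
    return p_joint(x, z) / p_marginal(x)

def bits_back_rate(x):
    # Expected net codelength in bits for symbol x:
    #   pay -log2 p(x,z) to encode (x, z),
    #   get log2 q(z|x) bits back from decoding z out of spare bits.
    return sum(q_posterior(z, x) *
               (-math.log2(p_joint(x, z)) + math.log2(q_posterior(z, x)))
               for z in range(2))

for x in range(3):
    ideal = -math.log2(p_marginal(x))
    print(f"x={x}: bits-back rate {bits_back_rate(x):.4f}, "
          f"ideal -log2 p(x) {ideal:.4f}")
```

With the exact posterior the two columns coincide, showing that bits-back coding attains the marginal codelength -log2 p(x) without ever computing the marginal sum inside the coder; with an approximate q(z|x) the gap is the KL divergence between q and the true posterior.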
Bio: Daniel is a first-year M.A.Sc. student supervised by Prof. Ashish Khisti. He is also a Graduate Student Researcher at the Vector Institute, supervised by Alireza Makhzani, where he has been awarded the Vector Scholarship in AI. In 2015, he received a B.Sc. in Electronics Engineering, with honours, from the Federal University of Santa Catarina, Brazil, advised by Danilo Silva. In 2014, he spent a year abroad at UofT on a government scholarship, where he interned with Prof. Frank Kschischang over the summer. Prior to starting graduate school, he worked as a Machine Learning Engineer for 5 years, mostly in the healthcare sector. His research interests include information theory and machine learning. He will be interning at Facebook AI Research during the summer of 2021, hosted by Karen Ullrich.
More info at https://dsevero.com