On Exponential Families and the Expressive Power of Related Generative Models

by Guido Montúfar
Max Planck Institute for Mathematics in the Sciences

In this talk I will present results on exponential families that shed light on the representational power of various generative models. Generative models can produce high-order correlations among visible variables even when a number of conditional independence conditions are imposed on them (a property important for efficient learning). The question is: how many hidden variables must they contain? Deep Belief Networks (DBNs) and Restricted Boltzmann Machines (RBMs) represent weighted sums of conditional distributions and are therefore special kinds of mixture models. I will discuss a scheme for decomposing probability distributions into mixtures of exponential families and derive from it some results for DBNs and RBMs.
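To make the mixture-model viewpoint concrete, the following is a standard derivation, not taken from the talk itself; the notation (visible units $v$, hidden units $h$, parameters $W$, $a$, $b$) is illustrative. For an RBM with $v \in \{0,1\}^n$, $h \in \{0,1\}^m$, and energy $E(v,h) = -a^\top v - b^\top h - v^\top W h$, marginalizing out the hidden units gives

\[
p(v) \;=\; \sum_{h \in \{0,1\}^m} p(h)\, p(v \mid h),
\qquad
p(v \mid h) \;=\; \prod_{i=1}^{n} p(v_i \mid h),
\]

where the conditional distribution of the visible units factorizes given $h$, since the RBM has no connections among visible units. The RBM distribution is thus a mixture of at most $2^m$ fully factorized (product) distributions, one component per hidden state, which is the sense in which such models are special kinds of mixture models.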