Recent Posts

How do you build the E(n) Equivariant Normalizing Flows from our recent paper? We will discuss 1) Normalizing Flows, 2) Continuous-Time Normalizing Flows, 3) E(n) GNNs, and 4) Argmax Flows. Finally, we discuss our 5) E(n) Flows; a small primer sketch follows below. Most of these topics are tangential: if you don't care about one, just read the intuition and skip ahead :)
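As a primer for the topics listed above, here is the change-of-variables computation that every normalizing flow builds on, as a minimal PyTorch sketch; `flow` and `base_dist` are generic placeholders, not code from the paper or the post.

```python
import torch

def flow_log_likelihood(x, flow, base_dist):
    """Change of variables: log p(x) = log p_base(f(x)) + log|det df/dx|.

    `flow` is any invertible map returning (z, log_det), and `base_dist`
    is a simple prior such as a standard normal; both are placeholders.
    """
    z, log_det = flow(x)
    return base_dist.log_prob(z).sum(dim=-1) + log_det
```

Training a flow amounts to maximizing this quantity, so the whole game is designing invertible maps whose log-determinant is cheap to evaluate.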

We introduce three types of invertible convolutions: i) emerging convolutions for invertible zero-padded convolutions, ii) invertible periodic convolutions, and iii) stable and flexible 1 × 1 convolutions.
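As a taste of the third construction, here is a minimal sketch of an invertible 1 × 1 convolution in PyTorch; the class and its initialization are illustrative of the general technique, not the paper's implementation.

```python
import torch
import torch.nn as nn

class Inv1x1Conv(nn.Module):
    """Invertible 1 x 1 convolution: mixes channels with a square matrix W.

    Hypothetical sketch. Since the same c x c mixing is applied at every
    spatial position, log|det J| = height * width * log|det W|.
    """
    def __init__(self, num_channels):
        super().__init__()
        # Start from a random rotation so the weight is invertible at init.
        q, _ = torch.linalg.qr(torch.randn(num_channels, num_channels))
        self.weight = nn.Parameter(q)

    def forward(self, x):
        b, c, h, w = x.shape
        y = torch.einsum('ij,bjhw->bihw', self.weight, x)
        logdet = h * w * torch.slogdet(self.weight)[1]
        return y, logdet

    def inverse(self, y):
        return torch.einsum('ij,bjhw->bihw', torch.linalg.inv(self.weight), y)
```

In a flow, the returned `logdet` is added to the base log-likelihood; the inverse is needed only for sampling.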

Publications

This paper introduces a generative model equivariant to Euclidean symmetries: E(n) Equivariant Normalizing Flows (E-NFs). To construct …

Generative flows and diffusion models have been predominantly trained on ordinal data, for example, natural images. This paper …

This paper introduces a new model to learn graph neural networks equivariant to rotations, translations, reflections and permutations …

Efficient gradient computation of the Jacobian determinant term is a core problem of the normalizing flow framework. Thus, most …

This paper introduces the Variational Determinant Estimator (VDE), a variational extension of the recently proposed determinant …

This paper introduces a new method to build linear flows by taking the exponential of a linear transformation. This linear …
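What makes this parameterization attractive is the identity det(exp(W)) = exp(trace(W)), so the log-Jacobian-determinant of the flow is simply trace(W) and the inverse is exp(-W). A hedged NumPy/SciPy sketch of the idea (the helper name is hypothetical):

```python
import numpy as np
from scipy.linalg import expm

def exponential_linear_flow(x, W):
    """Linear flow z = exp(W) @ x.

    det(exp(W)) = exp(trace(W)), so the log-det term is just trace(W);
    invertibility is free, since exp(W) always has the inverse exp(-W).
    """
    z = expm(W) @ x
    log_det = np.trace(W)
    return z, log_det

# Inverting the flow:
# x = expm(-W) @ z
```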

Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions. …

Media is generally stored digitally and is therefore discrete. Many successful distribution models in deep learning learn a …

Autoregressive models (ARMs) currently hold state-of-the-art performance in likelihood-based modeling of image and audio data. …

Normalizing Flows (NFs) are able to model complicated distributions p(y) with strong inter-dimensional correlations and high …

Lossless compression methods shorten the expected representation size of data without loss of information, using a statistical model. …

Generative flows are attractive because they admit exact likelihood optimization and efficient image synthesis. Recently, Kingma & …

The effectiveness of Convolutional Neural Networks stems in large part from their ability to exploit the translation invariance that is …