Recent Posts

How do you build E(n) Equivariant Normalizing Flows, from our recent paper? We will discuss 1) Normalizing Flows, 2) Continuous-Time Normalizing Flows, 3) E(n) GNNs, and 4) Argmax Flows. Finally, we discuss our 5) E(n) Flows. Most of these topics are tangential: if you don’t care about one, just read the intuition and skip it :)
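The post builds on the standard normalizing-flow change of variables: for an invertible map z = f(x), log p(x) = log p_Z(f(x)) + log|det df/dx|. A minimal sketch with a hypothetical 1-D affine flow (scale `s`, shift `t` are illustrative, not from the paper):

```python
import numpy as np

# Minimal change-of-variables sketch: z = s*x + t with a standard normal
# base density, so the Jacobian is just s and log|det| = log|s|.
def log_prob(x, s, t):
    z = s * x + t                                # forward transform
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi))   # log N(z; 0, 1)
    return log_pz + np.log(np.abs(s))            # + log|det Jacobian|

# Sanity check: with s=1, t=0 this is exactly the standard normal density.
assert np.isclose(log_prob(0.0, 1.0, 0.0), -0.5 * np.log(2 * np.pi))
```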

Distillation without quality degradation.

We introduce three types of invertible convolutions: i) emerging convolutions for invertible zero-padded convolutions, ii) invertible periodic convolutions, and iii) stable and flexible 1 × 1 convolutions.
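As a rough sketch of the simplest case (not the paper's implementation): a 1 × 1 convolution applies the same channel-mixing matrix W at every spatial location, so it is invertible whenever W is, and its log-determinant is just the spatial size times log|det W|.

```python
import numpy as np

def conv_1x1(x, W):
    """x: (H, Ws, C) feature map, W: (C, C) channel-mixing matrix."""
    z = x @ W.T                                           # mix channels per pixel
    h, w, _ = x.shape
    logdet = h * w * np.log(np.abs(np.linalg.det(W)))     # flow log-determinant
    return z, logdet

def conv_1x1_inverse(z, W):
    return z @ np.linalg.inv(W).T                         # undo the channel mixing

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 4, 3))
W = rng.normal(size=(3, 3))       # a random matrix is generically invertible
z, logdet = conv_1x1(x, W)
assert np.allclose(x, conv_1x1_inverse(z, W))
```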

Publications

This preprint introduces Discrete Moment Matching Distillation for discrete diffusion models. The approach distills generators for text …

Unified Latents trains latent representations together with a diffusion prior and diffusion decoder. The method links reconstruction …

Simpler Diffusion shows that pixel-space diffusion can compete with latent diffusion at high resolution. A compact recipe for loss …

This work distills many-step diffusion models into few-step samplers by matching conditional expectations along the denoising …

Rolling Diffusion Models target temporal generation by progressively increasing uncertainty over future frames. A sliding-window …

This paper shows that high-resolution image diffusion can be trained directly in pixel space with a simple recipe: adapt the noise …

Blurring Diffusion Models connect heat-dissipation based generative processes to Gaussian diffusion with non-isotropic noise. This view …

This work introduces an E(3)-equivariant diffusion model for 3D molecule generation. The model denoises continuous atom coordinates and …

Autoregressive Diffusion Models unify order-agnostic autoregressive models and absorbing discrete diffusion under one simple …

This paper introduces a generative model equivariant to Euclidean symmetries: E(n) Equivariant Normalizing Flows (E-NFs). To construct …

Generative flows and diffusion models have been predominantly trained on ordinal data, for example natural images. This paper …

This paper introduces a new model to learn graph neural networks equivariant to rotations, translations, reflections and permutations …

Efficient gradient computation of the Jacobian determinant term is a core problem of the normalizing flow framework. Thus, most …

This paper introduces the Variational Determinant Estimator (VDE), a variational extension of the recently proposed determinant …

This paper introduces a new method to build linear flows, by taking the exponential of a linear transformation. This linear …
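A hedged sketch of the construction (the Taylor-series `expm` below is illustrative, not the paper's algorithm): a linear flow z = exp(M) x is always invertible, with inverse exp(-M), and by Jacobi's formula log|det exp(M)| = trace(M), so the log-determinant comes for free.

```python
import numpy as np

def expm(M, terms=30):
    """Matrix exponential via truncated Taylor series (fine for small M)."""
    out, term = np.eye(len(M)), np.eye(len(M))
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

M = np.array([[0.1, -0.3], [0.2, 0.05]])
x = np.array([1.0, 2.0])

z = expm(M) @ x            # forward transform
x_rec = expm(-M) @ z       # exact inverse
logdet = np.trace(M)       # log|det exp(M)| = tr(M), by Jacobi's formula

assert np.allclose(x, x_rec)
assert np.isclose(logdet, np.log(np.linalg.det(expm(M))))
```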

Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions. …

Media is generally stored digitally and is therefore discrete. Many successful distribution models in deep learning learn a …

Autoregressive models (ARMs) currently hold state-of-the-art performance in likelihood-based modeling of image and audio data. …

Normalizing Flows (NFs) are able to model complicated distributions p(y) with strong inter-dimensional correlations and high …

Lossless compression methods shorten the expected representation size of data without loss of information, using a statistical model. …

Generative flows are attractive because they admit exact likelihood optimization and efficient image synthesis. Recently, Kingma & …

The effectiveness of Convolutional Neural Networks stems in large part from their ability to exploit the translation invariance that is …