State of the Field
Recent advances in generative modeling focus on improving robustness and efficiency across applications. New frameworks such as Conditional Unbalanced Optimal Transport address the challenges posed by outliers in conditional settings, which matters for tasks like image generation where data quality varies widely. Work on Fourier transformers is advancing the discovery of crystalline materials by generating complex structures that respect physical constraints, streamlining materials-science research. The study of Wasserstein gradient flows is refining generative models to mitigate issues such as mode collapse, improving their stability and performance. The shift toward Riemannian optimization in tensor networks is also noteworthy, as it exploits manifold constraints to make generative modeling more efficient. Together, these developments signal a maturing field with a clear trajectory toward more reliable, application-ready generative models across diverse domains.
Papers
Conditional Unbalanced Optimal Transport Maps: An Outlier-Robust Framework for Conditional Generative Modeling
The Conditional Optimal Transport (COT) problem aims to find a transport map between conditional source and target distributions while minimizing the transport cost. Recently, these transport maps have be...
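The core object here, a cost-minimizing transport map between empirical distributions, can be sketched in a few lines. This is a generic discrete OT toy (uniform weights, equal sample sizes, squared-Euclidean cost, solved as an assignment problem), not the paper's conditional or unbalanced formulation:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)

# Two small point clouds standing in for source and target samples.
X = rng.normal(loc=0.0, size=(8, 2))   # source samples
Y = rng.normal(loc=3.0, size=(8, 2))   # target samples

# Squared-Euclidean transport cost between every source/target pair.
C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)

# With uniform weights and equal sample sizes, the optimal transport plan
# reduces to an assignment problem: each source point maps to one target.
rows, cols = linear_sum_assignment(C)
transport_cost = C[rows, cols].mean()
print(transport_cost)
```

The unbalanced variant the paper studies relaxes the hard marginal constraints, which is what lets it discount outliers; that relaxation is not shown here.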
Fourier Transformers for Latent Crystallographic Diffusion and Generative Modeling
The discovery of new crystalline materials calls for generative models that handle periodic boundary conditions, crystallographic symmetries, and physical constraints, while scaling to large and struc...
Gradient Flow Drifting: Generative Modeling via Wasserstein Gradient Flows of KDE-Approximated Divergences
We present a precise mathematical framework for a new family of generative models that we call Gradient Flow Drifting. Within this framework, we prove an equivalence between the recently proposed Drif...
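The ingredients named in the title, a Wasserstein gradient flow driven by a KDE-approximated density, can be illustrated with a minimal particle simulation. This is a generic sketch of the idea (discretized flow of KL toward a standard Gaussian, with the particle density estimated by a Gaussian KDE); the bandwidth, step size, and divergence choice are illustrative assumptions, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)
h = 0.5        # KDE bandwidth (assumed choice)
eps = 0.05     # flow step size
x = rng.normal(loc=4.0, size=200)   # particles, initially far from the target

def grad_log_kde(x):
    # Gradient of the log of a Gaussian KDE built from the particles themselves.
    d = x[:, None] - x[None, :]                  # pairwise differences x_i - x_j
    w = np.exp(-0.5 * (d / h) ** 2)              # Gaussian kernel weights
    return (-(d / h**2) * w).sum(1) / w.sum(1)   # d/dx log rho_hat at each x_i

# Discretized Wasserstein gradient flow of KL(rho || N(0,1)): particles drift
# along grad log pi - grad log rho_hat, with pi = N(0,1) so grad log pi(x) = -x.
for _ in range(400):
    x = x + eps * (-x - grad_log_kde(x))

print(x.mean(), x.std())
```

The attraction term pulls particles toward the target while the KDE term acts as a repulsion that keeps them spread out, which is the mechanism gradient-flow methods use to avoid mode collapse.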
Efficient Generative Modeling with Unitary Matrix Product States Using Riemannian Optimization
Tensor networks, originally developed to characterize complex quantum many-body systems, have recently emerged as a powerful framework for capturing high-dimensional probability distribut...
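The optimization technique in the title, Riemannian gradient descent over unitary matrices, can be shown on a single matrix rather than a full matrix product state. A minimal sketch, assuming a toy objective (distance to a fixed matrix) rather than the paper's likelihood: project the Euclidean gradient onto the anti-Hermitian tangent coordinates and retract with a matrix exponential, which keeps the iterate exactly unitary:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
n = 4
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

def loss(U):
    # Toy objective: squared distance from a fixed target matrix A.
    return np.linalg.norm(U - A) ** 2

U = np.eye(n, dtype=complex)
eta = 0.05
losses = [loss(U)]
for _ in range(200):
    G = 2 * (U - A)                  # Euclidean gradient of the loss
    S = U.conj().T @ G
    Omega = 0.5 * (S - S.conj().T)   # anti-Hermitian part: tangent coordinates
    U = U @ expm(-eta * Omega)       # retraction stays on the unitary manifold
    losses.append(loss(U))

# U remains unitary up to numerical error, and the loss decreases.
print(losses[0], losses[-1])
```

Because expm of an anti-Hermitian matrix is unitary, the constraint never needs re-projection, which is the efficiency argument for Riemannian methods over penalty-based ones.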
Generative Drifting is Secretly Score Matching: a Spectral and Variational Perspective
Generative Modeling via Drifting has recently achieved state-of-the-art one-step image generation through a kernel-based drift operator, yet the success is largely empirical and its theoretical founda...
On the Robustness of Langevin Dynamics to Score Function Error
We consider the robustness of score-based generative modeling to errors in the estimate of the score function. In particular, we show that Langevin dynamics is not robust to the L^2 errors (more gener...
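The phenomenon under study, Langevin sampling with an imperfect score, is easy to probe numerically. A minimal sketch, assuming a standard-Gaussian target and a constant additive bias as a stand-in for score estimation error (the paper's error model is more general):

```python
import numpy as np

rng = np.random.default_rng(3)
eps = 0.05      # step size
x = rng.normal(size=5000)   # particles

def score(x, err=0.0):
    # Exact score of N(0,1) is -x; `err` adds a constant bias to mimic
    # an imperfect learned score function (a toy perturbation).
    return -x + err

# Unadjusted Langevin algorithm: x <- x + eps*score(x) + sqrt(2*eps)*noise.
for _ in range(2000):
    x = x + eps * score(x, err=0.5) + np.sqrt(2 * eps) * rng.normal(size=x.size)

# A constant score bias of 0.5 shifts the stationary mean to about 0.5:
# even a small, bounded score error moves the sampled distribution.
print(x.mean(), x.std())
```

Even this crude experiment shows the sensitivity the abstract points to: the error in the score translates directly into a bias of the stationary distribution.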