CAMEL: Cybersecurity, Autonomous system, and Machine learning Engineering Lab


Diffusion Models

Related Publications


ICLR 2025
Greed is Good: Guided Generation from a Greedy Perspective (arXiv)
Zander Blasingame and Chen Liu
The Thirteenth International Conference on Learning Representations (ICLR 2025) Workshop on Frontiers in Probabilistic Inference: Learning meets Sampling (FPI), Singapore EXPO, April 24-28, 2025
Abstract: Training-free guided generation is a widely used and powerful technique that allows the end user to exert further control over the generative process of diffusion models. In this work, we explore guided generation from the perspective of optimizing the solution trajectory of a neural differential equation in a greedy manner. We present this strategy as a unifying view on training-free guidance by showing that the greedy strategy is a first-order discretization of end-to-end optimization techniques. We show that a greedy guidance strategy makes good decisions and compare it to a guidance strategy using the ideal gradients found via the continuous adjoint equations. We then show how other popular training-free guidance strategies can be viewed in a unified manner from this perspective.
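
As a rough illustration of the greedy strategy the abstract describes, here is a minimal sketch in PyTorch: after each solver step of the probability flow ODE, a single first-order gradient step is taken on the current state, rather than differentiating end-to-end through the whole trajectory. The drift field, guidance loss, and all hyperparameters below are hypothetical toy stand-ins, not the paper's implementation.

```python
import torch

def drift(x, t):
    # Hypothetical probability flow ODE drift; in practice this wraps a
    # learned score network. Here it is a toy linear field.
    return -0.5 * x

def guidance_loss(x):
    # Hypothetical differentiable guidance objective: pull samples
    # toward the point (2, 2).
    return ((x - 2.0) ** 2).sum()

def greedy_guided_sample(x, n_steps=50, guidance_scale=0.05):
    # Integrate the ODE from t = 1 down to t = 0 with Euler steps,
    # taking one first-order (greedy) gradient step on the state after
    # each solver step, instead of backpropagating end-to-end through
    # the trajectory via the continuous adjoint equations.
    dt = -1.0 / n_steps
    t = 1.0
    for _ in range(n_steps):
        x = x + drift(x, t) * dt                   # one Euler step
        x = x.detach().requires_grad_(True)
        (grad,) = torch.autograd.grad(guidance_loss(x), x)
        x = (x - guidance_scale * grad).detach()   # greedy guidance update
        t += dt
    return x

print(greedy_guided_sample(torch.randn(2)))
```

In this view, the greedy update is a first-order discretization of the end-to-end approach: each per-step gradient stands in for the ideal gradient that the continuous adjoint equations would propagate through the remaining trajectory.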

ICLR 2025
A Reversible Solver for Diffusion SDEs (arXiv)
Zander Blasingame and Chen Liu
The Thirteenth International Conference on Learning Representations (ICLR 2025) Workshop on Deep Generative Model in Machine Learning: Theory (DeLTa), Singapore EXPO, April 24-28, 2025
Abstract: Diffusion models have quickly become the state of the art for generation tasks across many data modalities. An important capability of diffusion models is the ability to encode samples from the data distribution back into the sampling prior distribution. This is useful both for editing real data samples and for guided generation via the continuous adjoint equations. We propose an algebraically reversible solver for diffusion SDEs that can exactly invert real data samples into the prior distribution.
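
To make the notion of algebraic reversibility concrete, below is a minimal sketch using the reversible Heun scheme of Kidger et al. (2021) as a stand-in; the paper's own solver differs, but the key property is the same: given the stored noise increments, each forward step has an exact algebraic inverse, so samples can be mapped to the prior and back with no error beyond floating point. The drift and diffusion functions are toy placeholders for the score-based reverse-time dynamics.

```python
import torch

def f(x, t):
    # Hypothetical drift; in a diffusion SDE this wraps the score network.
    return -0.5 * x

def g(t):
    # Hypothetical (state-independent) diffusion coefficient.
    return 1.0

def reversible_step(y, y_hat, t, dt, dW):
    # One forward step of the reversible Heun scheme, which carries a
    # pair of states (y, y_hat) to make the update invertible.
    y_hat_next = 2 * y - y_hat + f(y_hat, t) * dt + g(t) * dW
    y_next = (y + 0.5 * (f(y_hat, t) + f(y_hat_next, t + dt)) * dt
                + 0.5 * (g(t) + g(t + dt)) * dW)
    return y_next, y_hat_next

def reversible_step_inverse(y_next, y_hat_next, t, dt, dW):
    # Exact algebraic inverse of the forward step: replaying the same
    # noise increment dW recovers (y, y_hat) exactly.
    y_hat = 2 * y_next - y_hat_next - f(y_hat_next, t + dt) * dt - g(t + dt) * dW
    y = (y_next - 0.5 * (f(y_hat_next, t + dt) + f(y_hat, t)) * dt
               - 0.5 * (g(t + dt) + g(t)) * dW)
    return y, y_hat

# Round trip: invert one step exactly by replaying the stored noise.
y = torch.randn(4)
y_hat = y.clone()
t, dt = 0.0, 0.01
dW = torch.randn(4) * dt ** 0.5
y1, y_hat1 = reversible_step(y, y_hat, t, dt, dW)
y0, _ = reversible_step_inverse(y1, y_hat1, t, dt, dW)
assert torch.allclose(y0, y, atol=1e-6)  # inversion exact up to float error
```

This is what distinguishes an algebraically reversible solver from re-integrating the ODE/SDE backward in time, which only approximately reconstructs the trajectory.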

NeurIPS 2024
AdjointDEIS: Efficient Gradients for Diffusion Models (arXiv)
Zander Blasingame and Chen Liu
The Thirty-Eighth Annual Conference on Neural Information Processing Systems (NeurIPS 2024), Vancouver, Canada, December 10-15, 2024
Abstract: The optimization of the latents and parameters of diffusion models with respect to some differentiable metric defined on the output of the model is a challenging and complex problem. Sampling from diffusion models is done by solving either the probability flow ODE or the diffusion SDE, wherein a neural network approximates the score function, allowing a numerical ODE/SDE solver to be used. However, naive backpropagation techniques are memory intensive, requiring the storage of all intermediate states, and face additional complexity in handling the injected noise from the diffusion term of the diffusion SDE. We propose a novel family of bespoke ODE solvers for the continuous adjoint equations of diffusion models, which we call AdjointDEIS. We exploit the unique construction of diffusion SDEs to further simplify the formulation of the continuous adjoint equations using exponential integrators. Moreover, we provide convergence order guarantees for our bespoke solvers. Significantly, we show that the continuous adjoint equations for diffusion SDEs actually simplify to a simple ODE. Lastly, we demonstrate the effectiveness of AdjointDEIS for guided generation with an adversarial attack in the form of the face morphing problem. Our code will be released at https://github.com/zblasingame/AdjointDEIS.
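
As a sketch of the continuous adjoint machinery the abstract refers to, the snippet below computes the gradient of a loss on the model output with respect to the initial latent by integrating the state forward, then re-integrating state and adjoint backward together, so no intermediate states are stored (unlike naive backpropagation). The drift and plain Euler steps are toy stand-ins; AdjointDEIS itself uses bespoke exponential-integrator solvers and also covers the SDE case, neither of which this sketch implements.

```python
import torch

def drift(x, t):
    # Hypothetical probability flow ODE drift (a stand-in for the
    # score-network term in a real diffusion model).
    return -0.5 * (1.0 + t) * x

def sample(x1, n_steps=100):
    # Euler integration of the probability flow ODE from t = 1 to t = 0.
    dt = -1.0 / n_steps
    x, t = x1, 1.0
    for _ in range(n_steps):
        x = x + drift(x, t) * dt
        t += dt
    return x

def adjoint_gradient(x1, loss_fn, n_steps=100):
    # Gradient of loss_fn(x(0)) with respect to the initial latent x(1)
    # via the continuous adjoint equations.
    dt = -1.0 / n_steps
    x0 = sample(x1, n_steps).detach().requires_grad_(True)
    (a,) = torch.autograd.grad(loss_fn(x0), x0)   # a(0) = dL/dx(0)
    x, t = x0.detach(), 0.0
    for _ in range(n_steps):
        x = x.requires_grad_(True)
        f_val = drift(x, t)
        # Vector-Jacobian product a^T (df/dx) for the adjoint ODE
        # da/dt = -a^T (df/dx), stepped in reverse time.
        (vjp,) = torch.autograd.grad(f_val, x, grad_outputs=a)
        x = (x - f_val * dt).detach()   # approximate backward state step
        a = a + vjp * dt
        t -= dt
    return a   # dL/dx(1)

x1 = torch.randn(4)
print(adjoint_gradient(x1, lambda x: (x ** 2).sum()))
```

Only the state and adjoint at the current time step are kept in memory, which is the efficiency argument the abstract makes against storing the full trajectory for backpropagation.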