Diffusion-Based Inverse Solver on Function Spaces With Applications to PDEs
Abbas Mammadov, Julius Berner, Kamyar Azizzadenesheli, and 2 more authors
Machine Learning and the Physical Sciences Workshop at NeurIPS, 2024
We present a novel framework for solving inverse problems in function spaces using diffusion-based generative models. Unlike traditional methods, which often discretize the domain and operate on fixed grids, our approach is discretization-agnostic, allowing for flexibility during sampling and generalization across different resolutions. Built upon function space diffusion models with neural operator architectures, we adapt the denoising process of pre-trained diffusion models to efficiently sample from posterior distributions in function spaces. This framework can be applied to a variety of problems, such as the recovery of initial conditions and coefficient functions in noisy or partially observed PDE-based inverse problems like Darcy flow and the Navier–Stokes equations. To the best of our knowledge, this is the first diffusion-based plug-and-play solver for inverse problems that operates in a discretization-agnostic manner, providing a new perspective on inverse problems with functional data, as they typically arise in the context of PDEs.
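To make the idea of adapting a pre-trained denoising process for posterior sampling concrete, the sketch below shows a generic DPS-style guided sampling loop: at each step, the denoiser's clean estimate is corrected by the gradient of a data-fidelity term. This is only a minimal illustration under simplifying assumptions, not the authors' algorithm: the `denoiser`, the forward operator `A`, the toy noise schedule `sigma(t) = t`, and the update rule are all placeholders (in the paper's setting, the denoiser would be a neural-operator model acting on functions, which is what makes the procedure discretization-agnostic).

```python
# Hypothetical sketch of diffusion-based posterior sampling with data-fidelity
# guidance. All components (denoiser, forward operator A, noise schedule,
# hyperparameters) are illustrative stand-ins, not the paper's implementation.
import torch


def posterior_sample(denoiser, A, y, shape, n_steps=200, guidance_scale=1.0):
    """Sample approximately from p(x | y) by guiding an unconditional model.

    denoiser(x_t, sigma_t) is assumed to predict the clean signal x_0 from the
    noisy sample x_t; A maps a candidate function to observations y
    (e.g., a PDE forward map plus masking)."""
    x = torch.randn(shape)                       # start from pure noise
    ts = torch.linspace(1.0, 1.0 / n_steps, n_steps)
    for i, t in enumerate(ts):
        x = x.detach().requires_grad_(True)
        sigma_t = t                              # toy schedule: sigma(t) = t
        x0_hat = denoiser(x, sigma_t)            # estimate of the clean x_0
        # Data-fidelity guidance: gradient of ||A(x0_hat) - y||^2 w.r.t. x_t.
        residual = (A(x0_hat) - y).pow(2).sum()
        grad = torch.autograd.grad(residual, x)[0]
        t_next = ts[i + 1] if i + 1 < n_steps else torch.tensor(0.0)
        with torch.no_grad():
            # DDIM-style deterministic step toward x0_hat, then the
            # measurement-consistency correction.
            x = x0_hat + (t_next / sigma_t) * (x - x0_hat)
            x = x - guidance_scale * grad
    return x.detach()


if __name__ == "__main__":
    # Toy example: recover a 1D signal from noisy, subsampled observations.
    grid = 128                                   # the grid size is arbitrary
    mask = torch.zeros(grid)
    mask[::4] = 1.0                              # observe every 4th point
    A = lambda x: mask * x
    # Stand-in for a trained (neural-operator) denoiser.
    denoiser = lambda x, sigma: x / (1.0 + sigma ** 2)
    y = A(torch.sin(torch.linspace(0.0, 6.28, grid))) + 0.01 * torch.randn(grid)
    x_rec = posterior_sample(denoiser, A, y, shape=(grid,))
    print(x_rec.shape)
```

In this toy setup the guidance gradient is cheap because `A` is a masking operator; for PDE-based forward maps (e.g., Darcy flow or Navier–Stokes) the same loop applies in principle, but evaluating and differentiating through `A` is the dominant cost.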