Vectorized Conditional Neural Fields for Computational Fluid Dynamics

Abstract

In recent years, machine learning models based on Transformer components have gained prominence in solving Partial Differential Equations (PDEs). Current approaches, however, lack several desirable properties of a neural PDE solver, such as (i) generalization to PDE parameters not seen during training, (ii) spatial and temporal zero-shot super-resolution, (iii) continuous temporal extrapolation, (iv) applicability to PDEs of different dimensionalities, and (v) expedited inference for extended rollouts. Motivated by these limitations, we propose a time-continuous Transformer model that provides all of the above desirable properties of a neural PDE solver. Crucially, we fuse two classes of methods proposed in recent years: Neural Fields, which learn the solution of the PDE, and Neural Operators, which learn mappings between function spaces. We achieve this by conditioning the neural field on the current state of the physical system.
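To make the conditioning idea concrete, below is a minimal, hypothetical sketch of a conditional neural field in PyTorch: a network that maps a space-time query (x, t), together with a latent encoding of the current discretized state, to the solution value at that point. This illustrates the general technique only; it is not the VCNeF architecture (which is built from Transformer components), and the class and parameter names are invented for this example.

```python
import torch
import torch.nn as nn

class ConditionalNeuralField(nn.Module):
    """Hypothetical sketch (not the authors' code): a neural field u_theta(x, t)
    conditioned on an encoding of the current PDE state, so the same network can
    be queried at arbitrary space-time coordinates."""

    def __init__(self, state_dim, hidden_dim=128):
        super().__init__()
        # Encoder that summarizes the current discretized state into a latent vector.
        self.state_encoder = nn.Sequential(
            nn.Linear(state_dim, hidden_dim), nn.GELU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        # Field network: maps (x, t, latent) -> solution value at that query point.
        self.field = nn.Sequential(
            nn.Linear(2 + hidden_dim, hidden_dim), nn.GELU(),
            nn.Linear(hidden_dim, hidden_dim), nn.GELU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, state, coords, t):
        # state:  (batch, state_dim)    current discretized solution u(., t0)
        # coords: (batch, n_points, 1)  query positions x
        # t:      (batch, n_points, 1)  query times
        z = self.state_encoder(state)                       # (batch, hidden_dim)
        z = z.unsqueeze(1).expand(-1, coords.shape[1], -1)  # broadcast to query points
        inp = torch.cat([coords, t, z], dim=-1)
        return self.field(inp)                              # predicted u(x, t)


# Usage: query the field on a finer grid than the input state was given on.
model = ConditionalNeuralField(state_dim=64)
state = torch.randn(8, 64)         # current state on a coarse 64-point grid
coords = torch.rand(8, 256, 1)     # 256 query points (finer than the input)
t = torch.full((8, 256, 1), 0.5)   # continuous-valued query time
u_pred = model(state, coords, t)   # (8, 256, 1)
```

Because the field is queried point-wise in continuous (x, t), the same trained model can in principle be evaluated on finer grids or at later times than seen during training, which is the property the abstract refers to as zero-shot super-resolution and continuous temporal extrapolation.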

Publication
In the Machine Learning for Fluid Dynamics Workshop, organized by the European Research Community on Flow, Turbulence and Combustion (ERCOFTAC).
Jan Hagnberger
Student in Data Science & AI

My research interests include machine learning for science, deep learning, transformers and partial differential equations.