How GNNs and Symmetries Can Help Solve PDEs
Deep learning has seen remarkable advances in recent years, largely replacing traditional methods in fields such as speech recognition, natural language processing, and image and video analysis. A particularly versatile deep architecture that has gained much traction lately is the graph neural network (GNN), of which the transformer is a special case. GNNs have the desirable property that they can process graph-structured data while respecting permutation symmetry. Recently, GNNs have found new applications in scientific computing, for instance predicting the properties of molecules or the forces acting on atoms as molecules evolve (e.g., fold). In these applications it is also key that geometric symmetries, such as translation and rotation, are taken into account.

Professor Max Welling will report on yet another exciting application: using GNNs to solve partial differential equations (PDEs). It turns out that GNNs are an excellent tool for developing neural PDE integrators. Moreover, PDEs are full of surprising symmetries that can be leveraged to train neural integrators with less data. He will discuss this exciting new chapter in deep learning and end with a discussion of whether, conversely, PDEs can also serve as a model for new deep architectures.
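To make the permutation-symmetry property concrete, here is a minimal, illustrative sketch (not taken from the talk) of a single message-passing layer in NumPy. Each node sums its neighbours' features via the adjacency matrix and applies a shared weight matrix; the weights and graph are random placeholders. Relabelling the nodes before the layer gives the same result as relabelling its output, i.e. the layer is permutation equivariant:

```python
import numpy as np

rng = np.random.default_rng(0)

def gnn_layer(A, X, W):
    # A: (n, n) adjacency, X: (n, d) node features, W: (d, d) shared weights.
    # Sum-aggregation over neighbours, then a shared nonlinearity.
    return np.tanh((A @ X) @ W)

n, d = 5, 3
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1); A = A + A.T          # symmetric adjacency, no self-loops
X = rng.normal(size=(n, d))
W = rng.normal(size=(d, d))

# Permutation equivariance check: permuting the node labels commutes
# with applying the layer.
P = np.eye(n)[rng.permutation(n)]       # random permutation matrix
out_perm_first = gnn_layer(P @ A @ P.T, P @ X, W)
out_perm_after = P @ gnn_layer(A, X, W)
print(np.allclose(out_perm_first, out_perm_after))  # True
```

The equality holds because `P @ A @ P.T @ P @ X = P @ A @ X` (permutation matrices are orthogonal) and the elementwise `tanh` commutes with row permutations.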
Joint work with Johannes Brandstetter and Daniel Worrall.
Science Park 904, Room C1.110 or hybrid