2024 Google Summer of Code and SciML's Summer Fellowship!

Interested in the work going on over the summer? Meet our students as they describe their fellowship and GSoC projects!

Kirill Zubov: Physics-Informed Neural Operators (PINO)

The goal of the project is to extend NeuralPDE.jl with the physics-informed neural operator (PINO) method. PINO trains a neural operator that maps the parameters of a parametric partial differential equation (PDE) to its solution, using the physics of the equation itself, without requiring (though not excluding the use of) a dataset.

The distinctive element of PINO is that it learns operators mapping between infinite-dimensional function spaces, rather than the finite-dimensional vector spaces usual in deep learning.

In this project I would like to focus on the two most widespread approaches: DeepONets and Fourier Neural Operators.

Articles about the topic:

Sifan Wang et al., "Learning the solution operator of parametric partial differential equations with physics-informed DeepONets", https://arxiv.org/abs/2103.10974

Zongyi Li et al., "Physics-Informed Neural Operator for Learning Partial Differential Equations", https://arxiv.org/abs/2111.03794

Zongyi Li et al., "Fourier Neural Operator for Parametric Partial Differential Equations", https://arxiv.org/abs/2010.08895
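
As a rough illustration of the DeepONet half of the project (a minimal sketch with made-up layer sizes and a plain dot-product combination, not the planned NeuralPDE.jl design): the branch network encodes the input function sampled at a fixed set of sensor points, the trunk network encodes a query coordinate, and the operator output is the inner product of the two embeddings.

```julia
using Lux, Random

# Hypothetical layer sizes: the input function is sampled at 32 sensor points,
# and both networks produce a 64-dimensional embedding.
branch = Chain(Dense(32 => 64, tanh), Dense(64 => 64))
trunk  = Chain(Dense(1 => 64, tanh), Dense(64 => 64))

rng = Random.default_rng()
ps_b, st_b = Lux.setup(rng, branch)
ps_t, st_t = Lux.setup(rng, trunk)

# G(u)(y): the branch net embeds the sampled input function u, the trunk net
# embeds the query coordinate y, and their inner product is the prediction.
function deeponet(u_sensors, y)
    b, _ = branch(u_sensors, ps_b, st_b)
    t, _ = trunk(y, ps_t, st_t)
    return sum(b .* t; dims = 1)
end

u = rand(Float32, 32, 10)   # a batch of 10 input functions, each sampled at 32 sensors
y = rand(Float32, 1, 10)    # one query coordinate per batch element
deeponet(u, y)              # 1 × 10 operator outputs
```

In the physics-informed setting, the PDE residual evaluated on this operator output (by differentiating with respect to the trunk input y) supplies the training loss, so no paired solution data is required.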

About me: My name's Kirill. Briefly about myself: I was born in Kazakhstan, and when I was 3 my family and I emigrated to Russia, so my childhood was split between Russia and Kazakhstan. After finishing school I moved to Saint Petersburg, where I earned a Bachelor's and a Master's degree at Saint Petersburg State University; somewhere in between I was a visiting student at Harbin Institute of Technology in China. For a little over two years now I have been living in Tbilisi, Georgia. Years ago I took part in SciML through GSoC, which is where NeuralPDE.jl began, and this is now my second time in the SciML summer fellowship. I hope it will be successful this time and will raise NeuralPDE.jl to a new level.


Paras Puneet Singh: Generalized Multistart Optimization Strategies Interface

The purpose of this project is to create a generalized implementation of multistart optimization and to improve the various existing implementations so that they are compatible with all solvers supported by Optimization.jl. This is to be done through the EnsembleProblem interface, leveraging the already available parallelization infrastructure. Global optimization problems are ubiquitous across domains, from engineering design to machine learning model tuning, and multistart strategies provide a powerful approach to tackling them by running local solvers from multiple initial points in the parameter space. Along with the interface, I would also like to explore a few other multistart optimization methods such as Ant Colony Optimization (ACO), MDGOP, etc.
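
To make the idea concrete, here is a bare-bones multistart loop in Julia using Optimization.jl and OptimizationOptimJL.jl. This is only a sketch of the concept on a made-up Rastrigin test problem; the project's actual interface will route the starts through EnsembleProblem and its parallel ensemble algorithms rather than a serial map.

```julia
using Optimization, OptimizationOptimJL

# Rastrigin in 2D: many local minima, global minimum at the origin.
rastrigin(u, p) = 10length(u) + sum(u .^ 2 .- 10 .* cos.(2π .* u))

optf = OptimizationFunction(rastrigin, Optimization.AutoForwardDiff())
starts = [10 .* rand(2) .- 5 for _ in 1:20]   # 20 random starts in [-5, 5]²

# Run a local solver from every start and keep the best local optimum found.
sols = map(starts) do u0
    solve(OptimizationProblem(optf, u0), LBFGS())
end
best = argmin(s -> s.objective, sols)
best.u, best.objective
```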

About me: I am Paras Puneet Singh, currently in my second year of undergraduate studies at BITS Pilani Hyderabad Campus. My passion lies at the intersection of computer science and mathematics, with a growing interest in machine learning (ML) and its practical implementations. My recent projects include an e-commerce website, a CRUD website, a landmark-detection ML project, a Haar-cascade face classifier, and an API-based AI summarizer website. These projects have not only honed my skills but have also fueled my passion for learning. I am a fast and resolute learner, eager to explore the depths of computer science. My journey has led me to NumFOCUS, where I am thrilled to be joining a diverse and supportive team, contributing to an open-source project focused on optimization and leveraging my skills and enthusiasm to make meaningful contributions to the field.


Abhinav Pratap Singh: Feature enhancements in NeuralPDE.jl and LuxNeuralOperators.jl

This project aims to add certain feature enhancements to NeuralPDE.jl, the first of which is the Deep Ritz method, a deep learning algorithm for solving second-order PDEs in high dimensions by recasting the problem in variational form. Training a PINN can be quite cumbersome, so the next enhancement adds a Neural Tangent Kernel (NTK) adaptive loss function to the package; this will also involve modifying the way sampling and loss functions are built for quadrature in the package. The second half of the project will work toward feature parity of LuxNeuralOperators.jl with NeuralOperators.jl, adding DeepONets, Markov Neural Operators, and the continuous form of the neural operator to the package, along with benchmark tests and initial documentation and tutorials.
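
As a rough sketch of the Deep Ritz idea (my own one-dimensional Poisson illustration using Lux.jl and Zygote.jl, not the planned NeuralPDE.jl implementation): the PDE residual loss is replaced by the variational energy I[u] = ∫ (½ u′(x)² − f(x) u(x)) dx, estimated by Monte Carlo sampling and minimized over a neural-network ansatz.

```julia
using Lux, Random, Zygote, Statistics

# -u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0; f chosen so that u(x) = sin(πx).
f(x) = π^2 .* sin.(π .* x)

net = Chain(Dense(1 => 16, tanh), Dense(16 => 1))
rng = Random.default_rng()
ps, st = Lux.setup(rng, net)

# Enforce the boundary conditions by construction: u(x) = x (1 - x) N(x).
u(x, ps) = x .* (1 .- x) .* vec(first(net(reshape(x, 1, :), ps, st)))

# Monte Carlo estimate of the Ritz energy; training minimizes this over ps.
function ritz_energy(ps; n = 256)
    x = rand(Float32, n)
    du = Zygote.gradient(x -> sum(u(x, ps)), x)[1]   # u'(x) at the sample points
    mean(0.5f0 .* du .^ 2 .- f(x) .* u(x, ps))
end

ritz_energy(ps)
```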

About me: I am Abhinav, a second-year PhD student in Earth and Atmospheric Sciences at Georgia Tech. I grew up in India and completed my Integrated MTech at IIT (ISM) Dhanbad in 2022. My primary research interests are in solving geophysical inverse problems and quantifying the associated uncertainties, and for this purpose I intend to use neural networks to obtain solutions of certain PDEs. I am developing a Julia package for my research and hope to push the boundaries by using neural networks and neural operators to reduce inference times.


Théo Duez: Implementation of some numerical methods

The aim of my project is to implement new numerical methods in the DifferentialEquations.jl ecosystem, and in particular in the OrdinaryDiffEq.jl package. More specifically, my first objective is to implement the relaxation method to improve the convergence and/or stability of certain Runge-Kutta methods for equations where a particular quantity is conserved over time. Then, I would like to move on to other methods such as Rosenbrock methods or W-methods.
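
A minimal sketch of the relaxation idea (my own harmonic-oscillator illustration, not the planned OrdinaryDiffEq.jl code): after a Runge-Kutta step produces an increment Δu, the update is rescaled to u + γΔu with γ chosen so that a conserved quantity, here η(u) = ‖u‖², is preserved exactly.

```julia
using LinearAlgebra

f(u) = [u[2], -u[1]]   # harmonic oscillator: η(u) = u₁² + u₂² is conserved

# Classic RK4 increment Δu for one step of size dt.
function rk4_increment(f, u, dt)
    k1 = f(u)
    k2 = f(u .+ dt / 2 .* k1)
    k3 = f(u .+ dt / 2 .* k2)
    k4 = f(u .+ dt .* k3)
    dt / 6 .* (k1 .+ 2 .* k2 .+ 2 .* k3 .+ k4)
end

# Relaxed step: choose γ so that ‖u + γΔu‖² = ‖u‖² holds exactly (γ stays ≈ 1).
function relaxed_step(f, u, dt)
    Δu = rk4_increment(f, u, dt)
    γ = -2 * dot(u, Δu) / dot(Δu, Δu)
    u .+ γ .* Δu
end

u0 = [1.0, 0.0]
u1 = relaxed_step(f, u0, 0.1)
norm(u1) ≈ norm(u0)   # the invariant is preserved to round-off
```

In the full relaxation Runge-Kutta method, the step is additionally interpreted as advancing time by γ*dt rather than dt, which is what preserves the order of accuracy of the underlying scheme.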

About me: I am Théo, a mathematics student at Université Paris-Saclay with a background in physics and engineering, and I will be graduating in December. Last year, I took a gap year with two internships, during which I discovered the world of numerical simulation. During the second internship, I discovered Julia and open-source code. It was a great experience, and I want to gain more experience and skills in open-source development in Julia, especially in numerical simulation, one of my main interests in mathematics. That is why I am taking part in the SciML fellowship.


Kabir Jain: Implementation of Physics-Informed Point Networks for Simulation in Irregular Geometries

The goal of this project is to implement Physics-Informed Point Networks (PIPNs) in Julia to solve partial differential equations (PDEs) on irregular geometries. PIPNs combine deep learning (specifically PointNet architectures for processing point cloud data) with the governing physics equations to enable efficient simulations on complex geometric domains. I plan on solving this problem by creating a well-documented, optimized and thoroughly tested Julia package for PIPNs that can efficiently solve PDEs on complex geometries. The implementation will be integrated with relevant scientific computing and machine learning packages in Julia.
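
As a small illustration of the PointNet building block (hypothetical layer sizes, using Lux.jl; not the final package design): a shared MLP is applied to every point of the cloud independently, and a symmetric max-pooling produces a permutation-invariant global feature. In a PIPN, the per-point and global features are combined, and the PDE residual evaluated at the points supplies the physics loss.

```julia
using Lux, Random

# A shared MLP (hypothetical sizes) applied to every point of a 2D point cloud.
shared_mlp = Chain(Dense(2 => 64, tanh), Dense(64 => 128, tanh))

rng = Random.default_rng()
ps, st = Lux.setup(rng, shared_mlp)

points = rand(Float32, 2, 1024)                # 1024 points sampled from an irregular geometry
local_feats, _ = shared_mlp(points, ps, st)    # 128 × 1024 per-point features (shared weights)
global_feat = maximum(local_feats; dims = 2)   # 128 × 1 feature, invariant to point ordering
```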

About me: I am Kabir Jain, a student from Singapore. I love physics and AI, so this project is a great intersection of both of my passions. I'm looking forward to working with all of you!


Rahul Manavalan: FastSolvers.jl - A package for accelerated PDE evaluations

My project is about implementing fast PDE solvers and benchmarking them against the methods in MethodOfLines.jl. "Fast solvers" is a slightly overused and abused term that needs disambiguation: in principle, fast solvers are non-standard discretization schemes (neither FDM nor FEM), solvers based on kernel methods, and the like. They exploit structures that are specific to a given PDE and are by construction more parsimonious, and therefore fast. The scope of my project can be found here: https://summerofcode.withgoogle.com/proposals/details/kiwo9LSO
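
As one concrete, and admittedly standard, example of what exploiting PDE-specific structure buys (my illustration, not a method from the proposal): for the periodic one-dimensional Poisson problem, constant coefficients make the Laplacian diagonal in Fourier space, so the whole solve costs O(n log n) with FFTW.jl instead of a generic dense factorization.

```julia
using FFTW

# Solve -u''(x) = f(x) on [0, 2π) with periodic boundary conditions.
n = 256
x = range(0, 2π; length = n + 1)[1:end-1]
f = sin.(3 .* x)                       # exact solution: u(x) = sin(3x) / 9

k = fftfreq(n, n)                      # integer wavenumbers 0, 1, …, -1
û = fft(f) ./ (k .^ 2 .+ eps())        # -u'' = f  ⇔  k² û = f̂, mode by mode
û[1] = 0                               # pin the otherwise undetermined mean to zero
u = real(ifft(û))

maximum(abs.(u .- sin.(3 .* x) ./ 9))  # ≈ machine precision
```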

About me: My name is Rahul Manavalan. I am a PhD student in Mathematical Statistics and Scientific Computing at Lund University.
