You give us data and we give you back LaTeX for the differential equation system that generated the data. That may sound like the future, but the future is here. In this SciML ecosystem update I am pleased to announce that a lot of our data-driven modeling components are finally released with full documentation. Let’s dive right in!
DataDrivenDiffEq.jl: Dynamic Mode Decomposition and Sparse Identification of Models
DataDrivenDiffEq.jl has arrived, complete with documentation and a full set of examples. Thanks to Julius Martensen (@AlCap23) for really driving this effort. You can use this library to identify the sparse functional form of...
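To make the idea concrete, here is a minimal sketch of the sequential thresholded least-squares core that SINDy-style sparse identification builds on. This is written in plain Julia and is not the DataDrivenDiffEq.jl API; the toy damped-oscillator data and the candidate library are invented for illustration.

```julia
# Sketch of sequential thresholded least squares (the SINDy core idea).
# NOT the DataDrivenDiffEq.jl API -- just the underlying algorithm.
using LinearAlgebra

# Sample states of a toy linear system: dx/dt = -0.1x + 2y, dy/dt = -2x - 0.1y
n = 200
X = [randn(n) randn(n)]                       # n samples of (x, y)
dx1 = -0.1 .* X[:, 1] .+ 2.0 .* X[:, 2]       # exact derivatives, column 1
dx2 = -2.0 .* X[:, 1] .- 0.1 .* X[:, 2]       # exact derivatives, column 2
DX = [dx1 dx2]

# Candidate library Θ(X): each column is one candidate term
Θ = [X[:,1] X[:,2] X[:,1].^2 X[:,1].*X[:,2] X[:,2].^2]

# Solve Θ*Ξ ≈ DX, repeatedly zeroing small coefficients and refitting
function stlsq(Θ, DX; λ = 0.05, iters = 10)
    Ξ = Θ \ DX
    for _ in 1:iters
        small = abs.(Ξ) .< λ
        Ξ[small] .= 0.0
        for j in 1:size(DX, 2)                # refit each equation on its support
            big = .!small[:, j]
            Ξ[big, j] = Θ[:, big] \ DX[:, j]
        end
    end
    return Ξ
end

Ξ = stlsq(Θ, DX)    # only the linear terms survive the thresholding
```

Since the data here is exact and the library contains the true terms, the recovered `Ξ` matches the generating coefficients and the nonlinear rows are exactly zero.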
Computational scientific discovery is at an interesting juncture. While we have mechanistic models of lots of different scientific phenomena, and reams of data being generated from experiments, our computational capabilities are unable to keep up. Our problems are too large for realistic simulation. Our problems are multiscale and too stiff. Our problems require tedious work like calculating gradients and getting code to run on GPUs and supercomputers. Our next step forward is a combination of science and machine learning: an approach that combines mechanistic models with data-based reasoning, presented as a unified set of abstractions and a high-performance implementation....
This release is the long-awaited DAE extravaganza! We are releasing fully-implicit DAE integrators written in pure Julia, and thus compatible with things like GPUs and arbitrary precision. We have added various DAE initialization schemes for automatically finding consistent initial conditions, and have also upgraded our solvers to handle state- and time-dependent mass matrices. These results have also trickled over to DiffEqFlux, with new neural ODE structs that support singular mass matrices (DAEs). Together this is a very comprehensive push into the DAE world.
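As a concrete illustration of the mass-matrix side, here is a sketch of the classic Robertson problem posed through the `ODEFunction` `mass_matrix` interface; the solver choice (`Rodas5`) is just one of the mass-matrix-capable methods, not a prescription.

```julia
# Robertson chemical kinetics as a mass-matrix DAE: two differential
# equations plus one algebraic conservation constraint.
using OrdinaryDiffEq

function rober(du, u, p, t)
    y1, y2, y3 = u
    k1, k2, k3 = p
    du[1] = -k1*y1 + k3*y2*y3
    du[2] =  k1*y1 - k3*y2*y3 - k2*y2^2
    du[3] =  y1 + y2 + y3 - 1       # algebraic constraint row
    nothing
end

M = [1.0 0.0 0.0
     0.0 1.0 0.0
     0.0 0.0 0.0]                   # singular mass matrix ⇒ DAE
f = ODEFunction(rober, mass_matrix = M)
prob = ODEProblem(f, [1.0, 0.0, 0.0], (0.0, 1e5), (0.04, 3e7, 1e4))
sol = solve(prob, Rodas5())
```

The zero row in `M` turns the third equation into a pure constraint, which the stiff solver enforces along the trajectory.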
DImplicitEuler and DABDF2: Fully Implicit DAE Solvers in Pure Julia
Yes, you saw...
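In the fully implicit form, a DAE is specified by its residual `f(out, du, u, p, t)` through `DAEProblem` and handed to the new pure-Julia integrators. A minimal sketch using the same Robertson system, with `differential_vars` marking which equations carry derivatives:

```julia
# Fully implicit DAE: residual form out = f(du, u, t), solved with the
# pure-Julia DImplicitEuler integrator.
using OrdinaryDiffEq

function rober_dae!(out, du, u, p, t)
    out[1] = -0.04u[1] + 1e4*u[2]*u[3] - du[1]
    out[2] =  0.04u[1] - 3e7*u[2]^2 - 1e4*u[2]*u[3] - du[2]
    out[3] =  u[1] + u[2] + u[3] - 1.0   # algebraic equation: no du term
    nothing
end

u0  = [1.0, 0.0, 0.0]
du0 = [-0.04, 0.04, 0.0]                 # consistent initial derivative
prob = DAEProblem(rober_dae!, du0, u0, (0.0, 1e5),
                  differential_vars = [true, true, false])
sol = solve(prob, DImplicitEuler())
```

Since the solver is pure Julia, the same problem can in principle be run with GPU arrays or arbitrary-precision numbers, which is the point of this release.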
After the release of the paper Universal Differential Equations for Scientific Machine Learning, we have had very good feedback and have seen plenty of new users joining the Julia differential equation ecosystem and utilizing the tools for scientific machine learning. A lot of our work in this release focuses on these capabilities, mixing in GPU support and global sensitivity analysis to augment the normal local tools of SciML.
1,000 Stars for DifferentialEquations.jl!
Before the bigger updates, I wanted to announce that DifferentialEquations.jl surpassed the 1,000 star milestone in this round. This is very helpful for the community as an...
Cluster Multi-GPU Support in DiffEqGPU
The DiffEqGPU automated GPU parallelism tools now support multiple GPUs. The README now shows that one can do things like:
```julia
# Setup processes with different CUDA devices
using Distributed
addprocs(numgpus)
import CUDAdrv, CUDAnative

gpuworkers = asyncmap(collect(zip(workers(), CUDAdrv.devices()))) do (p, d)
    remotecall_wait(CUDAnative.device!, p, d)
    p
end
```
to set up each individual process with a separate GPU, and then the standard usage of DiffEqGPU.jl:
```julia
function lorenz(du, u, p, t)
    @inbounds begin
        du[1] = p[1] * (u[2] - u[1])
        du[2] = u[1] * (p[2] - u[3]) - u[2]
        du[3] = u[1] * u[2] - p[3] * u[3]
    end
    nothing
end

u0 = Float32[1.0; 0.0; 0.0]
tspan = (0.0f0, 100.0f0)
p = (10.0f0, 28.0f0, 8/3f0)
prob = ODEProblem(lorenz, u0, tspan, p)
prob_func...
```
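The excerpt cuts off at the `prob_func` definition. For context, an ensemble GPU solve with DiffEqGPU.jl typically continues along these lines; the randomized-initial-condition `prob_func` and the trajectory/batch counts here are illustrative assumptions, not a quote from the README:

```julia
# Illustrative continuation (assumed, not verbatim): solve an ensemble of
# Lorenz problems with randomized initial conditions, batched across the
# GPUs set up above.
prob_func = (prob, i, repeat) -> remake(prob, u0 = rand(Float32, 3) .* u0)
monteprob = EnsembleProblem(prob, prob_func = prob_func)
sol = solve(monteprob, Tsit5(), EnsembleGPUArray(),
            trajectories = 100_000, batch_size = 50_000)
```

With a `batch_size` smaller than `trajectories`, the batches can be distributed across the worker processes, each pinned to its own GPU.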