Abstract
We present a novel differentiable grid-based representation for efficiently solving differential equations (DEs). Widely used architectures for neural solvers, such as sinusoidal neural networks, are coordinate-based MLPs that are both computationally intensive and slow to train. Although grid-based alternatives for implicit representations (e.g., Instant-NGP and K-Planes) offer faster training, their reliance on linear interpolation restricts their ability to compute higher-order derivatives, rendering them unsuitable for solving DEs. In contrast, our approach overcomes these limitations by combining the efficiency of feature grids with radial basis function interpolation, which is infinitely differentiable. To effectively capture high-frequency solutions and enable stable, faster computation of global gradients, we introduce a multi-resolution decomposition with co-located grids. Our proposed representation, ∂∞-Grid, is trained implicitly using the differential equations as loss functions, enabling accurate modeling of physical fields. We validate ∂∞-Grid on a variety of tasks, including the Poisson equation for image reconstruction, the Helmholtz equation for wave fields, and the Kirchhoff–Love boundary value problem for cloth simulation. Our results demonstrate a 5–20× speed-up over coordinate-based MLP methods, solving differential equations in seconds or minutes while maintaining comparable accuracy and compactness.
In a nutshell
Given a differential equation $\mathcal{F}\left(\mathbf{x}, \mathbf{u}, \nabla _\mathbf{x} \mathbf{u}, \nabla _\mathbf{x} ^2 \mathbf{u}, \dots\right) = 0$, we solve for the field $\mathbf{u}: \Omega \to \mathbb{R}^m$ by representing it with multi-scale differentiable feature grids and minimising the equation residual as the loss.
Step 1: Sample and interpolate
Sample query points $\mathbf{x}$ from the input domain $\Omega$ (here, 2D) and apply infinitely differentiable radial basis function (RBF) interpolation $\varphi(\mathbf{x},\mathbf{x}_i)$ to extract smooth features from the multi-scale grids $\{\mathbf{F}_s\}_{s=0}^S$. Concatenate the per-scale features $\{\mathbf{f}_s(\mathbf{x})\}_s$ into $\mathbf{f}(\mathbf{x})$.
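As a minimal sketch of this step, the snippet below interpolates per-scale features with a Gaussian RBF (one infinitely differentiable choice) and concatenates them. The kernel, regular 2D grid layout, weight normalisation, and shape parameters are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def gaussian_rbf(x, centers, eps):
    """Gaussian RBF phi(x, x_i) = exp(-(eps * ||x - x_i||)^2); C^inf smooth."""
    d2 = np.sum((x[None, :] - centers) ** 2, axis=-1)
    return np.exp(-(eps ** 2) * d2)

def interpolate_features(x, grids, eps_per_scale):
    """Blend each scale's grid features with normalised RBF weights, then concatenate."""
    feats = []
    for F, eps in zip(grids, eps_per_scale):
        R = F.shape[0]                          # grid resolution per axis (2D, assumed)
        ii, jj = np.meshgrid(np.linspace(0, 1, R), np.linspace(0, 1, R), indexing="ij")
        centers = np.stack([ii.ravel(), jj.ravel()], axis=-1)   # (R*R, 2) grid centres
        w = gaussian_rbf(x, centers, eps)
        w = w / w.sum()                         # partition-of-unity normalisation (assumed)
        feats.append(w @ F.reshape(R * R, -1))  # f_s(x): weighted feature blend
    return np.concatenate(feats)                # f(x): stacked multi-scale features

# toy example: three scales holding 4-dim features each
rng = np.random.default_rng(0)
grids = [rng.standard_normal((R, R, 4)) for R in (4, 8, 16)]
f = interpolate_features(np.array([0.3, 0.7]), grids, eps_per_scale=[2.0, 4.0, 8.0])
print(f.shape)  # (12,)
```

Because every weight is a smooth function of $\mathbf{x}$, the resulting features admit derivatives of any order, unlike linear grid interpolation.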
Step 2: Neighbourhood precomputation
The RBF kernel $\varphi(\mathbf{x},\mathbf{x}_i)$, with $i \in \mathcal{N}(\mathbf{x})$, is globally supported, which is costly at high resolution. With stratified sampling, we fix a compact, distance-based neighbourhood $\mathcal{N}_{\rho}(\mathbf{x})$ per query (set by the RBF shape parameter) and precompute it once before optimisation. This bounds memory and compute while preserving the overlapping supports required for higher-order derivatives.
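A sketch of such a precomputation, assuming a Gaussian kernel: truncating weights below a tolerance `tol` gives a cutoff radius $\rho = \sqrt{-\ln(\text{tol})}/\varepsilon$ from $e^{-(\varepsilon\rho)^2} = \text{tol}$. The function name, tolerance, and brute-force distance search are illustrative, not the paper's implementation:

```python
import numpy as np

def precompute_neighbourhoods(queries, centers, eps, tol=1e-4):
    """For each fixed query point, keep only the grid centres whose Gaussian
    RBF weight exceeds tol. Computed once before optimisation; the resulting
    index lists bound per-query memory and compute during training."""
    rho = np.sqrt(-np.log(tol)) / eps          # distance where the weight drops to tol
    neighbourhoods = []
    for x in queries:
        d = np.linalg.norm(centers - x, axis=-1)
        neighbourhoods.append(np.flatnonzero(d <= rho))
    return neighbourhoods

rng = np.random.default_rng(1)
centers = rng.random((256, 2))                 # flattened grid centres (toy data)
queries = rng.random((8, 2))                   # stratified sample points, assumed fixed
nbrs = precompute_neighbourhoods(queries, centers, eps=8.0)
```

Because neighbourhoods of nearby queries still overlap, mixed and higher-order derivatives across query points remain well defined.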
Step 3: Produce the signal
A decoder $\mathbf{d}(\mathbf{f}(\mathbf{x});\Theta)$ (linear layer or shallow MLP) maps features to the signal $\mathbf{u}(\mathbf{x})$, e.g., an image or deformation field. We enforce Dirichlet boundary or initial conditions as hard constraints by modulating the prediction with a boundary-aware weighting $\mathcal{B}(\mathbf{x})$.
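A toy version of the decoding step with a hard Dirichlet constraint: below, $\mathcal{B}(\mathbf{x}) = \prod_k x_k(1-x_k)$ vanishes on the unit-square boundary, so $\mathbf{u}(\mathbf{x}) = g + \mathcal{B}(\mathbf{x})\,\mathbf{d}(\mathbf{f}(\mathbf{x}))$ meets the boundary value $g$ exactly for any learned parameters. This product form is one simple choice; the paper's exact weighting may differ:

```python
import numpy as np

def decode(f, W, b, x, g=0.0):
    """Linear decoder d(f(x); Theta) = W f + b, modulated by a boundary-aware
    weight B(x) that is zero whenever any coordinate of x hits 0 or 1."""
    B = np.prod(x * (1.0 - x))    # boundary-aware weighting (illustrative choice)
    d = W @ f + b                 # shallow decoder output
    return g + B * d              # hard Dirichlet constraint u = g on the boundary

W = np.ones((1, 12)); b = np.zeros(1)   # toy decoder parameters
f = np.ones(12)                          # toy interpolated feature vector
u_interior = decode(f, W, b, np.array([0.5, 0.5]))
u_boundary = decode(f, W, b, np.array([0.0, 0.5]))
print(u_boundary)  # [0.] — boundary value pinned to g = 0
```

Enforcing the condition by construction removes the need for a boundary penalty term in the loss.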
Step 4: Optimise the solution
Use automatic differentiation to obtain the derivatives needed by $\mathcal{F}$, then optimise the feature grids $\{\mathbf{F}_s\}_s$ and decoder $\Theta$ with gradient-based training. The solution is the field $\mathbf{u}(\mathbf{x};\Theta, \{\mathbf{F}_s\}_s)$ that minimises the residual of $\mathcal{F}$.
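The loop below sketches this step on a 1D Poisson problem $u'' = f$, standing in closed-form Gaussian RBF derivatives for autodiff (for this kernel, $\varphi'' = (4\varepsilon^4 r^2 - 2\varepsilon^2)\varphi$ with $r = x - x_i$) and plain gradient descent on the residual for the paper's training setup. The shape parameter, resolutions, and step rule are illustrative choices:

```python
import numpy as np

eps = 6.0
centers = np.linspace(0.0, 1.0, 16)                # 1D "grid" of RBF centres
xs = np.linspace(0.0, 1.0, 64)                     # collocation (query) points
r = xs[:, None] - centers[None, :]
phi = np.exp(-(eps * r) ** 2)                      # u(x)   = phi  @ c
phi2 = (4 * eps**4 * r**2 - 2 * eps**2) * phi      # u''(x) = phi2 @ c, closed form
f_rhs = -np.pi**2 * np.sin(np.pi * xs)             # Poisson RHS; true u = sin(pi x)

c = np.zeros(len(centers))                         # learnable coefficients
L = 2.0 * np.linalg.norm(phi2, ord=2) ** 2 / len(xs)   # Lipschitz bound of the loss grad
for _ in range(2000):                              # gradient descent on the DE residual
    grad = 2.0 * phi2.T @ (phi2 @ c - f_rhs) / len(xs)
    c -= grad / L                                  # step size 1/L keeps descent stable

final = np.mean((phi2 @ c - f_rhs) ** 2)           # residual loss after training
print(final)
```

Since the representation is linear in the coefficients, this loss is a convex quadratic and the residual decreases monotonically; the full method instead backpropagates through the RBF features and decoder jointly.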
BibTeX
@inproceedings{kairanda2026partialgrid,
title={$\partial^{\infty}$-Grid: A Neural Differential Equation Solver with Differentiable Feature Grids},
author={Kairanda, Navami and Naik, Shanthika and Habermann, Marc and Sharma, Avinash and Theobalt, Christian and Golyanik, Vladislav},
booktitle={International Conference on Learning Representations (ICLR)},
year={2026}
}