We investigate the use of models from the theory of regularity structures as features in machine learning tasks. A model is a polynomial function of a space-time signal designed to well-approximate solutions to partial differential equations (PDEs), even in low regularity regimes. Models can be seen as natural multi-dimensional generalisations of signatures of paths; our work therefore aims to extend the recent use of signatures in data science beyond the context of time-ordered data. We provide a flexible definition of a model feature vector associated to a space-time signal, along with two algorithms which illustrate ways in which these features can be combined with linear regression. We apply these algorithms in several numerical experiments designed to learn solutions to PDEs with a given forcing and boundary data. Our experiments include semi-linear parabolic and wave equations with forcing, and Burgers' equation with no forcing. We find an advantage in favour of our algorithms when compared to several alternative methods. Additionally, in the experiment with Burgers' equation, we observe that the predictive power remains stable when noise is added to the observations.
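As a rough illustration of the feature-plus-linear-regression pipeline described above, here is a toy sketch. It is not the paper's definition of a model: the grid size, the heat-semigroup smoothing, and the three hand-picked features are all assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: learn the map f -> u from the forcing f to a "solution" u.
# As a stand-in for genuine model features, we use f itself, a heat-type
# smoothing I[f] of f, and the pointwise product I[f] * f.

n = 32                            # spatial grid points (periodic domain)
k = np.fft.fftfreq(n, d=1.0 / n)  # integer Fourier frequencies

def heat(f, t=0.05):
    """Apply a heat-semigroup-type smoothing e^{-t k^2} via FFT."""
    return np.real(np.fft.ifft(np.exp(-t * k**2) * np.fft.fft(f)))

def features(f):
    """Stack three model-like features of the signal f, shape (n, 3)."""
    If = heat(f)
    return np.stack([f, If, If * f], axis=-1)

# Synthetic data: the target is, by construction, linear in these features
# (coefficients 0, 1 and 0.1), so linear regression can recover the map.
F = rng.standard_normal((200, n))
U = np.array([heat(f) + 0.1 * heat(f) * f for f in F])

X = np.concatenate([features(f) for f in F])   # (200 * n, 3) feature matrix
y = U.ravel()
coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # ordinary least squares

pred = X @ coef
print(np.round(coef, 3))   # should recover roughly [0, 1, 0.1]
```

Because the synthetic target is linear in these three features by construction, least squares recovers the coefficients essentially exactly; on real PDE data one would instead truncate a genuine model expansion and regress on its components.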
We investigate existence, uniqueness and regularity for local solutions of rough parabolic equations with subcritical noise of the form $du_t - L_t u_t\,dt = N(u_t)\,dt + \sum_{i=1}^d F_i(u_t)\,d\mathbf X^i_t$, where $(L_t)_{t\in[0,T]}$ is a time-dependent family of unbounded operators acting on some scale of Banach spaces, while $\mathbf X\equiv(X,\mathbb X)$ is a two-step (not necessarily geometric) rough path of H\"older regularity $\gamma >1/3$. Besides dealing with non-autonomous evolution equations, our results also allow for unbounded operators in the noise term (up to some critical loss of regularity depending on that of the rough path $\mathbf X$). As a technical tool, we introduce a version of the multiplicative sewing lemma, which allows one to construct so-called product integrals in infinite dimensions. We later use it to construct a semigroup analogue for non-autonomous linear PDEs, and show how to deduce the semigroup version of the usual sewing lemma from it.
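To fix ideas, the multiplicative sewing construction can be caricatured as follows; this is a schematic statement in generic notation, not the precise hypotheses or norms of the lemma used here. One builds the product integral as a limit of ordered products over partitions,

```latex
\[
  P_{ts} \;=\; \lim_{|\pi|\to 0}\; \prod_{[u,v]\in\pi} \mu_{vu},
  \qquad \pi \text{ an (ordered) partition of } [s,t],
\]
```

which exists and is multiplicative ($P_{ts} = P_{tu} P_{us}$) provided the two-parameter germ $\mu$ is approximately multiplicative, $\|\mu_{ts} - \mu_{tu}\,\mu_{us}\| \lesssim |t-s|^{1+\varepsilon}$ for some $\varepsilon > 0$. For the germ $\mu_{ts} = \mathrm{Id} + (t-s)L$ this limit recovers the semigroup $e^{(t-s)L}$, which indicates the sense in which such a lemma can produce a semigroup analogue for non-autonomous linear PDEs.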
We consider a broad class of semilinear SPDEs with multiplicative noise driven by a finite-dimensional Wiener process. We show that, provided that an infinite-dimensional analogue of H\"ormander's bracket condition holds, the Malliavin matrix of the solution is an operator with dense range. In particular, we show that the laws of finite-dimensional projections of such solutions admit smooth densities with respect to Lebesgue measure. The main idea is to develop a robust pathwise solution theory for such SPDEs using rough path theory, which then allows us to use a pathwise version of Norris's lemma to work directly on the Malliavin matrix, instead of the reduced Malliavin matrix, which is not available in this context.
Along the way, we develop some new tools for the theory of rough paths, such as a rough Fubini theorem and a deterministic mild It\^o formula for rough PDEs.
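For context, in the classical finite-dimensional setting H\"ormander's bracket condition (stated here in its standard form; the infinite-dimensional analogue invoked above is more involved) asks that, for an SDE $dx_t = V_0(x_t)\,dt + \sum_{i=1}^d V_i(x_t)\circ dW^i_t$, the iterated Lie brackets of the vector fields span every tangent space:

```latex
\[
  \operatorname{span}\bigl\{\, V_i(x),\ [V_j,V_k](x),\ [V_j,[V_k,V_l]](x),\ \ldots
  \;:\; i\geq 1,\ j,k,l\geq 0 \,\bigr\} = \mathbb{R}^n
  \qquad \text{for all } x \in \mathbb{R}^n .
\]
```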
We consider the directed mean curvature flow on the plane in a weak Gaussian random environment. We prove that, when started from a sufficiently flat initial condition, a rescaled and recentred solution converges to the Cole-Hopf solution of the KPZ equation. This result follows from the analysis of a more general system of nonlinear SPDEs driven by inhomogeneous noises, using the theory of regularity structures. However, due to the inhomogeneity of the noise, the "black box" result developed in the series of works [Hai14, BHZ19, CH16, BCCH21] cannot be applied directly and requires a significant extension to infinite-dimensional regularity structures. The analysis of this general system of SPDEs yields two further results. First, we prove that the solution of the quenched KPZ equation with a very strong force also converges to the Cole-Hopf solution of the KPZ equation. Second, we show that a properly rescaled and renormalised quenched Edwards-Wilkinson model in any dimension converges to the stochastic heat equation.
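For reference, the Cole-Hopf solution appearing in the statements above is, in standard notation (conventions and constants here are generic, not necessarily those of the paper), defined through the multiplicative stochastic heat equation:

```latex
\[
  \partial_t Z = \partial_x^2 Z + Z\,\xi, \qquad h := \log Z,
\]
which serves as the notion of solution to the (formal) KPZ equation
\[
  \partial_t h = \partial_x^2 h + (\partial_x h)^2 + \xi .
\]
```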