Bayesian uncertainty quantification for data-driven equation learning
2021
Equation learning aims to infer differential equation models from data. While
a number of studies have shown that differential equation models can be
successfully identified when the data are sufficiently detailed and corrupted
with relatively small amounts of noise, the relationship between observation
noise and uncertainty in the learned differential equation models remains
unexplored. We demonstrate that for noisy data sets there is substantial
variation in both the structure of the learned differential equation models
and their parameter values. We explore how to combine data sets to quantify
uncertainty in the learned models and, at the same time, draw mechanistic
conclusions about the target differential equations. We generate noisy data
using a stochastic agent-based model and combine equation learning methods with
approximate Bayesian computation (ABC) to show that the correct differential
equation model can be successfully learned from data, while uncertainty is
quantified by a posterior distribution over parameter space.
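The abstract describes coupling equation learning with ABC so that parameter uncertainty is expressed as a posterior distribution. The paper's own code and model choices are not reproduced here; the following Python sketch is a hypothetical illustration only. It assumes a logistic-growth ODE as the learned model (standing in for the agent-based model's mean-field limit), uniform priors, an RMSE summary distance, and an arbitrary tolerance, and shows how ABC rejection sampling yields a posterior over the ODE parameters from noisy data.

```python
# Hypothetical sketch: ABC rejection sampling to quantify uncertainty in the
# parameters of a learned ODE. The logistic model, priors, and tolerance below
# are illustrative assumptions, not the paper's settings.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

def logistic_rhs(t, c, r, K):
    # Candidate learned model: dc/dt = r c (1 - c/K)
    return r * c * (1.0 - c / K)

# Synthetic "observed" data: a logistic trajectory corrupted with noise,
# standing in for averaged agent-based model output.
t_obs = np.linspace(0.0, 20.0, 41)
r_true, K_true, c0 = 0.5, 100.0, 5.0
c_clean = solve_ivp(logistic_rhs, (0.0, 20.0), [c0],
                    t_eval=t_obs, args=(r_true, K_true)).y[0]
c_obs = c_clean + rng.normal(scale=5.0, size=c_clean.shape)

def simulate(r, K):
    # Forward-solve the candidate ODE for a proposed parameter pair.
    return solve_ivp(logistic_rhs, (0.0, 20.0), [c0],
                     t_eval=t_obs, args=(r, K)).y[0]

def distance(sim, obs):
    # Summary distance between simulated and observed trajectories (RMSE).
    return np.sqrt(np.mean((sim - obs) ** 2))

# ABC rejection: draw parameters from the prior and keep those whose
# simulated trajectory lies within tolerance eps of the observations.
n_samples, eps = 5000, 6.0
prior_r = rng.uniform(0.0, 2.0, n_samples)
prior_K = rng.uniform(50.0, 200.0, n_samples)

accepted = [(r, K) for r, K in zip(prior_r, prior_K)
            if distance(simulate(r, K), c_obs) < eps]
posterior = np.array(accepted)

print(f"accepted {len(posterior)} of {n_samples} prior samples")
print("posterior mean (r, K):", posterior.mean(axis=0))
```

The accepted samples approximate the ABC posterior; tightening the tolerance eps or pooling additional noisy data sets would sharpen it, which is the sense in which combining data sets reduces uncertainty in the learned model.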