    Decomposing the parameter space of biological networks via a numerical discriminant approach
    Abstract:
Many systems in biology, physics, and engineering are described by systems of ordinary differential equations containing many parameters. When studying the dynamic behavior of these large, nonlinear systems, it is useful to identify and characterize the steady-state solutions as the model parameters vary, a technically challenging problem in a high-dimensional parameter landscape. Rather than simply determining the number and stability of steady states at distinct points in parameter space, we decompose the parameter space into finitely many regions such that the structure of the steady-state solutions is consistent within each region. From a computational algebraic viewpoint, the boundary of these regions is contained in the discriminant locus. We develop global and local numerical algorithms for constructing the discriminant locus and classifying the parameter landscape. We showcase our numerical approaches by applying them to molecular and cell-network models.
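To make the idea of a discriminant-based decomposition concrete, the following minimal sketch (not the paper's numerical algebraic-geometry algorithm) sweeps a single parameter of a toy one-dimensional model, dx/dt = p + x - x^3, and records where the number of real steady states changes; those parameter values approximate the discriminant locus, which for this cubic sits at p = ±2/(3√3).

```python
# Brute-force 1D illustration of decomposing a parameter axis by the number and
# stability of steady states of dx/dt = p + x - x^3. This is only a toy sweep,
# not the paper's numerical algebraic-geometry construction.
import numpy as np

def steady_states(p):
    """Real roots of p + x - x^3 = 0, with stability from f'(x) = 1 - 3x^2."""
    roots = np.roots([-1.0, 0.0, 1.0, p])                  # coefficients of -x^3 + x + p
    real = roots[np.abs(roots.imag) < 1e-9].real
    return [(x, 1.0 - 3.0 * x**2 < 0.0) for x in real]     # (location, is_stable)

p_grid = np.linspace(-1.0, 1.0, 2001)
counts = np.array([len(steady_states(p)) for p in p_grid])

# Parameter values where the steady-state count changes approximate the
# discriminant locus; for this cubic they bracket p = +/- 2/(3*sqrt(3)) ~ 0.385.
boundaries = p_grid[1:][np.diff(counts) != 0]
print("approximate discriminant points:", boundaries)
```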
We explore the derivation of distributed parameter system evolution laws (and in particular, partial differential operators and associated partial differential equations, PDEs) from spatiotemporal data. This is, of course, a classical identification problem; our focus here is on the use of manifold learning techniques (and, in particular, variations of Diffusion Maps) in conjunction with neural network learning algorithms that allow us to attempt this task when the dependent variables, and even the independent variables, of the PDE are not known a priori and must themselves be derived from the data. The similarity measure used in Diffusion Maps for dependent coarse variable detection involves distances between local particle distribution observations; for independent variable detection we use distances between local short-time dynamics. We demonstrate each approach through an illustrative established PDE example. Such variable-free, emergent space identification algorithms connect naturally with equation-free multiscale computation tools.
Keywords: Diffusion maps, distributed parameter systems, identification, similarity measures, manifold learning
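For reference, a minimal Diffusion Maps sketch is given below. It uses a plain Gaussian kernel on Euclidean distances rather than the paper's specialized similarity measures (distances between local particle distributions or between local short-time dynamics), and the kernel scale `eps` is assumed to be supplied by the user.

```python
# Minimal Diffusion Maps sketch: Gaussian kernel, row normalization to a Markov
# matrix, and leading nontrivial eigenvectors as the embedding coordinates.
import numpy as np
from scipy.spatial.distance import cdist

def diffusion_maps(data, eps, n_coords=2):
    """data: (n_samples, n_features); eps: kernel scale, e.g. a median squared distance."""
    K = np.exp(-cdist(data, data, metric="sqeuclidean") / eps)   # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)                         # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Drop the trivial constant eigenvector (eigenvalue 1); scale by eigenvalues.
    return vecs[:, 1:n_coords + 1] * vals[1:n_coords + 1]
```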
In this paper, we consider a parabolic problem with time-dependent heterogeneous coefficients. Many applied problems have coupled space and time heterogeneities. Their homogenization or upscaling requires cell problems that are formulated in space-time representative volumes for problems with scale separation. In problems without scale separation, local problems include multiple macroscopic variables and oversampled local problems, where these macroscopic parameters are computed. These approaches, called Non-local Multi-Continua (NLMC), were proposed in a number of previous papers for problems with complex spatial heterogeneities. In this paper, we extend this approach to space-time heterogeneities by identifying macroscopic parameters in space-time regions. Our proposed method, space-time Non-local Multi-Continua (space-time NLMC), is an efficient numerical solver for time-dependent heterogeneous coefficients. It provides a flexible and systematic way to construct multiscale basis functions that approximate the solution. These multiscale basis functions are constructed by solving local energy minimization problems in oversampled space-time regions such that the basis functions decay exponentially outside the oversampled domain. Unlike classical time-stepping methods combined with full-discretization techniques, our space-time NLMC efficiently constructs the multiscale basis functions in a space-time domain and can provide computational savings compared to space-only approaches, as we discuss in the paper. We present two numerical experiments, which show that the proposed approach provides good accuracy.
Keywords: Multiscale basis functions, solvers, channelized media, homogenization
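The core computational step in NLMC-type constructions is a constrained energy minimization on an oversampled region. The sketch below shows only that generic step: it minimizes u^T A u subject to Cu = b via the KKT saddle-point system, with a 1D Laplacian and two coarse-cell-average constraints standing in for the paper's space-time operators and multi-continuum constraints.

```python
# Generic constrained energy minimization: minimize u^T A u subject to C u = b,
# solved via the KKT (saddle-point) system. A, C, b are placeholders for the
# local space-time stiffness matrix, the coarse constraint operator, and the
# selector of the target coarse degree of freedom; the paper's operators differ.
import numpy as np

def local_basis(A, C, b):
    n, m = A.shape[0], C.shape[0]
    kkt = np.block([[A, C.T],
                    [C, np.zeros((m, m))]])
    rhs = np.concatenate([np.zeros(n), b])
    sol = np.linalg.solve(kkt, rhs)
    return sol[:n]                                # the energy-minimizing basis function

# Toy usage: 1D Laplacian as the "energy", two coarse-cell average constraints.
n = 40
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
C = np.zeros((2, n))
C[0, :n // 2] = 1.0 / (n // 2)
C[1, n // 2:] = 1.0 / (n - n // 2)
phi = local_basis(A, C, np.array([1.0, 0.0]))     # basis tied to the first coarse cell
```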
    Insightful visualization of multidimensional scalar fields, in particular parameter spaces, is key to many fields in computational science and engineering. We propose a principal component-based approach to visualize such fields that accurately reflects their sensitivity to input parameters. The method performs dimensionality reduction on the vast $L^2$ Hilbert space formed by all possible partial functions (i.e., those defined by fixing one or more input parameters to specific values), which are projected to low-dimensional parameterized manifolds such as 3D curves, surfaces, and ensembles thereof. Our mapping provides a direct geometrical and visual interpretation in terms of Sobol's celebrated method for variance-based sensitivity analysis. We furthermore contribute a practical realization of the proposed method by means of tensor decomposition, which enables accurate yet interactive integration and multilinear principal component analysis of high-dimensional models.
Keywords: Sobol method, realization
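The following sketch illustrates the underlying idea on a dense 2D grid with plain SVD (the paper's practical realization uses tensor decomposition instead): the partial functions obtained by fixing x1 are projected onto their top principal components, turning each x1 value into a point on a low-dimensional curve, and the variance of their means gives the first-order Sobol index of x1. The test function is an arbitrary stand-in.

```python
# PCA of partial functions and a variance-based (Sobol-style) sensitivity index,
# on a dense grid with plain SVD rather than the paper's tensor decomposition.
import numpy as np

f = lambda x1, x2: np.sin(3 * x1) + 0.3 * x1 * x2 + 0.1 * np.cos(5 * x2)   # stand-in field
x1 = np.linspace(0, 1, 200)
x2 = np.linspace(0, 1, 200)
F = f(x1[:, None], x2[None, :])              # row i = partial function f(x1_i, .)

# First-order Sobol index of x1: variance of the conditional means over x2.
S1 = F.mean(axis=1).var() / F.var()

# Project each partial function onto the top principal components: every value
# of x1 becomes a point on a 3D curve whose spread reflects sensitivity to x1.
Fc = F - F.mean(axis=0)
U, s, Vt = np.linalg.svd(Fc, full_matrices=False)
curve = U[:, :3] * s[:3]                      # shape (len(x1), 3)
print("first-order Sobol index of x1 ~", round(float(S1), 3))
print("3D curve of partial functions:", curve.shape)
```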
In this paper we present a systematic, data-driven approach to discovering bespoke coarse variables based on manifold learning algorithms. We illustrate this methodology with the classic Kuramoto phase oscillator model, and demonstrate how our manifold learning technique can successfully identify a coarse variable that is one-to-one with the established Kuramoto order parameter. We then introduce an extension of our coarse-graining methodology which enables us to learn evolution equations for the discovered coarse variables via an artificial neural network architecture templated on numerical time integrators (initial value solvers). This approach allows us to learn accurate approximations of time derivatives of state variables from sparse flow data, and hence discover useful approximate differential equation descriptions of their dynamic behavior. We demonstrate this capability by learning ODEs that agree with the known analytical expression for the Kuramoto order parameter dynamics at the continuum limit. We then show how this approach can also be used to learn the dynamics of coarse variables discovered through our manifold learning methodology. In both of these examples, we compare the results of our neural-network-based method to typical finite differences complemented with geometric harmonics. Finally, we present a series of computational examples illustrating how a variation of our manifold learning methodology can be used to discover sets of parameters (reduced parameter combinations) for multi-parameter models with complex coupling. We conclude with a discussion of possible extensions of this approach, including the possibility of obtaining data-driven effective partial differential equations for coarse-grained neuronal network behavior.
Keywords: ODEs, Kuramoto model, manifold learning
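A minimal sketch of the integrator-templated fit follows. A polynomial least-squares model stands in for the paper's neural network; the structure is the same: fit a right-hand side f so that a forward-Euler step reproduces the observed transitions R_{n+1} ≈ R_n + h f(R_n) of a scalar coarse variable (e.g., the Kuramoto order parameter magnitude) sampled with step h.

```python
# Learn an ODE right-hand side templated on a forward-Euler step. Polynomial
# regression replaces the paper's neural network; R is an assumed time series
# of the coarse variable sampled with step h.
import numpy as np

def fit_rhs(R, h, degree=3):
    """Fit f in R_{n+1} ~ R_n + h * f(R_n) by least squares on polynomial features."""
    X = np.vander(R[:-1], degree + 1)          # polynomial features of R_n (descending powers)
    y = (R[1:] - R[:-1]) / h                   # finite-difference target for f(R_n)
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return lambda r: np.polyval(coeffs, r)     # vander's descending order matches polyval

def euler_rollout(f, r0, h, n_steps):
    """Roll the learned model forward with the same forward-Euler template."""
    traj = [r0]
    for _ in range(n_steps):
        traj.append(traj[-1] + h * f(traj[-1]))
    return np.array(traj)
```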
    In this work we introduce a manifold learning-based method for uncertainty quantification (UQ) in systems describing complex spatiotemporal processes. Our first objective is to identify the embedding of a set of high-dimensional data representing quantities of interest of the computational or analytical model. For this purpose, we employ Grassmannian diffusion maps, a two-step nonlinear dimension reduction technique which allows us to reduce the dimensionality of the data and identify meaningful geometric descriptions in a parsimonious and inexpensive manner. Polynomial chaos expansion is then used to construct a mapping between the stochastic input parameters and the diffusion coordinates of the reduced space. An adaptive clustering technique is proposed to identify an optimal number of clusters of points in the latent space. The similarity of points allows us to construct a number of geometric harmonic emulators which are finally utilized as a set of inexpensive pre-trained models to perform an inverse map of realizations of latent features to the ambient space and thus perform accurate out-of-sample predictions. Thus, the proposed method acts as an encoder-decoder system which is able to automatically handle very high-dimensional data while simultaneously operating successfully in the small-data regime. The method is demonstrated on two benchmark problems and on a system of advection-diffusion-reaction equations which model a first-order chemical reaction between two species. In all test cases, the proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
Keywords: Diffusion maps, polynomial chaos, uncertainty quantification, data points, surrogate models, intrinsic dimension
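The surrogate step, mapping stochastic inputs to reduced coordinates with a polynomial chaos expansion, can be sketched as below. It assumes two standard-normal inputs and uses a total-degree-truncated tensor product of probabilists' Hermite polynomials fitted by least squares; the diffusion coordinates `Z` are assumed to come from a separate (e.g., Grassmannian diffusion maps) embedding, and the inverse map back to the ambient space (geometric harmonics in the paper) is not shown.

```python
# Polynomial chaos expansion (PCE) from stochastic inputs to reduced coordinates,
# fitted by least squares; a two-input Hermite basis stands in for the paper's setup.
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def pce_fit(X, Z, degree=3):
    """X: (n, 2) standard-normal inputs; Z: (n, k) reduced (diffusion) coordinates."""
    H0 = hermevander(X[:, 0], degree)          # probabilists' Hermite polynomials, (n, degree+1)
    H1 = hermevander(X[:, 1], degree)
    # Tensor-product basis truncated to total degree <= degree.
    Psi = np.stack([H0[:, i] * H1[:, j]
                    for i in range(degree + 1)
                    for j in range(degree + 1) if i + j <= degree], axis=1)
    coeffs, *_ = np.linalg.lstsq(Psi, Z, rcond=None)
    return coeffs, Psi

# Usage: Z_hat = Psi @ coeffs emulates the diffusion coordinates (rebuild Psi at new
# inputs for out-of-sample prediction); an inverse map such as geometric harmonics
# would then lift Z_hat back to the ambient space.
```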
    Manifold-learning techniques are routinely used in mining complex spatiotemporal data to extract useful, parsimonious data representations/parametrizations; these are, in turn, useful in nonlinear model identification tasks. We focus here on the case of time series data that can ultimately be modeled as a spatially distributed system [e.g., a partial differential equation (PDE)], but where we do not know the space in which this PDE should be formulated. Hence, even the spatial coordinates for the distributed system themselves need to be identified—to “emerge from”—the data mining process. We will first validate this “emergent space” reconstruction for time series sampled without space labels in known PDEs; this brings up the issue of observability of physical space from temporal observation data and the transition from spatially resolved to lumped (order-parameter-based) representations by tuning the scale of the data mining kernels. We will then present actual emergent space “discovery” illustrations. Our illustrative examples include chimera states (states of coexisting coherent and incoherent dynamics), and chaotic as well as quasiperiodic spatiotemporal dynamics, arising in partial differential equations and/or in heterogeneous networks. We also discuss how data-driven “spatial” coordinates can be extracted in ways invariant to the nature of the measuring instrument. Such gauge-invariant data mining can go beyond the fusion of heterogeneous observations of the same system, to the possible matching of apparently different systems.
Keywords: Manifold learning, observability
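A minimal sketch of the emergent-space idea: each unlabeled sensor is represented by its observed time series, pairwise distances between these series are computed, and a one-dimensional embedding of the sensors recovers an ordering that plays the role of the missing spatial coordinate. Classical multidimensional scaling is used here for brevity in place of the Diffusion Maps employed in the paper.

```python
# Recover a one-dimensional "emergent space" coordinate for unlabeled sensors
# from distances between their time series, using classical MDS.
import numpy as np
from scipy.spatial.distance import pdist, squareform

def emergent_coordinate(U):
    """U: (n_sensors, n_times) observations with unknown/scrambled sensor order."""
    D2 = squareform(pdist(U, metric="sqeuclidean"))   # distances between time series
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ D2 @ J                             # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    # The leading coordinate orders the sensors along the recovered "space".
    return vecs[:, -1] * np.sqrt(max(vals[-1], 0.0))
```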
Learning governing equations allows for deeper understanding of the structure and dynamics of data. We present a random sampling method for learning structured dynamical systems from under-sampled and possibly noisy state-space measurements. The learning problem takes the form of a sparse least-squares fit over a large set of candidate functions. Based on a Bernstein-like inequality for partly dependent random variables, we provide theoretical guarantees on the recovery rate of the sparse coefficients and the identification of the candidate functions for the corresponding problem. Computational results are demonstrated on datasets generated by the Lorenz 96 equation, the viscous Burgers' equation, and the two-component reaction-diffusion equations (which are challenging due to parameter sensitivities in the model). This formulation has several advantages, including ease of use, theoretical guarantees of success, and computational efficiency with respect to the ambient dimension and the number of candidate functions.
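The generic sparse least-squares step can be sketched as follows. This is not the paper's random-sampling formulation with its recovery guarantees; it only shows fitting time derivatives against a library of constant, linear, and quadratic candidates and pruning coefficients by sequentially thresholded least squares.

```python
# Sparse fit of dX/dt over a candidate library via sequentially thresholded
# least squares; a generic stand-in for the paper's random-sampling formulation.
import numpy as np

def candidate_library(X):
    """Constant, linear, and quadratic monomials of the state variables in X (n, d)."""
    n, d = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.stack(cols, axis=1)

def stlsq(Theta, dXdt, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: dX/dt ~ Theta @ W with sparse W."""
    W, *_ = np.linalg.lstsq(Theta, dXdt, rcond=None)
    for _ in range(n_iter):
        W[np.abs(W) < threshold] = 0.0
        for k in range(dXdt.shape[1]):                # refit each equation on surviving terms
            active = np.abs(W[:, k]) >= threshold
            if active.any():
                W[active, k], *_ = np.linalg.lstsq(Theta[:, active], dXdt[:, k], rcond=None)
    return W
```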