An algebraic generalization for graph and tensor-based neural networks

2017 
Despite significant effort, there is currently no formal or de facto standard framework or format for constructing, representing, or manipulating general neural networks. In computational neuroscience, there have been some attempts to formalize connectionist notations and generative operations for neural networks, including Connection Set Algebra, but none are truly formal or general. In computational intelligence (CI), though the use of linear algebra and tensor-based models is widespread, graph-based frameworks are also popular, and there is a lack of tools supporting the transfer of information between systems. To address these gaps, we exploited existing results about the connection between linear and relation algebras to define a concise, formal algebraic framework that generalizes graph and tensor-based neural networks. For simplicity and compatibility, this framework is purposefully defined as a minimal extension to linear algebra. We demonstrate the merits of this approach first by defining new operations for network composition, along with proofs of their most important properties. An implementation of the algebraic framework is presented and applied to create an instance of an artificial neural network that is compatible with both graph and tensor-based CI frameworks. The result is an algebraic framework for neural networks that generalizes the formats used in at least two systems, together with an example implementation.
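The correspondence the abstract relies on can be illustrated with a minimal sketch (not the paper's actual framework; all names below are illustrative assumptions): a network layer viewed both as a weighted relation (a graph-style edge set) and as a weight matrix (a tensor), with composition of two layers realized as matrix multiplication, the linear-algebra analogue of composing the underlying connection relations.

```python
# Minimal illustrative sketch, assuming a dense-matrix and edge-dict
# representation; this is not the framework defined in the paper.

def edges_to_matrix(edges, n_in, n_out):
    """Convert a graph-style edge relation {(i, j): w} to a weight matrix."""
    m = [[0.0] * n_out for _ in range(n_in)]
    for (i, j), w in edges.items():
        m[i][j] = w
    return m

def matrix_to_edges(m):
    """Convert a weight matrix back to a sparse edge relation."""
    return {(i, j): w for i, row in enumerate(m)
            for j, w in enumerate(row) if w != 0.0}

def compose(a, b):
    """Compose two layers via the matrix product -- the linear-algebra
    counterpart of composing the underlying connection relations."""
    n, k, p = len(a), len(b), len(b[0])
    return [[sum(a[i][t] * b[t][j] for t in range(k)) for j in range(p)]
            for i in range(n)]

# A 2->3 layer given as a graph, and a 3->2 layer given as a matrix:
g = {(0, 0): 1.0, (0, 2): 2.0, (1, 1): 3.0}
w1 = edges_to_matrix(g, 2, 3)
w2 = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
w12 = compose(w1, w2)  # the composed 2->2 network
print(matrix_to_edges(w12))
```

The round-trip between `edges_to_matrix` and `matrix_to_edges` mirrors the kind of information transfer between graph-based and tensor-based systems that the abstract identifies as lacking.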