The Tensor Derivative of Vector Functions

2021 
Taking the first-order partial derivatives of a vector-valued function yields the Jacobian matrix, which collects the partial derivatives of every entry. The matrix containing all second-order partials of each entry, the Hessian, is also well studied. While the first-order partial derivatives of a vector-valued function form a matrix, the second- and higher-order partials constitute arrays with increasingly complicated structure. Among the various ways of handling this problem, some methods use tensor products of (possibly matrix-valued) functions and partial derivatives. Here we follow a particularly simple version of that approach: we collect the partial-derivative operators into a column vector and apply it through consecutive tensor products to a vector-valued function. In this way the results remain vectors, and although the tensor product is not commutative, linear operators allow us to reach every permutation of the terms involved. The main objective of this chapter is to show how simple and transparent formulae can be derived when tensor products are used for higher-order partial derivatives of vector-valued functions. Faà di Bruno's formula will play an important role later, when we turn to the connections between moments and cumulants.
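As a brief illustrative sketch of the construction just described (the symbols $f$, $D_x$ and the ordering of the Kronecker factors below are assumptions for illustration and may differ from the chapter's own conventions), let $f:\mathbb{R}^d\to\mathbb{R}^m$ be sufficiently smooth and collect the partial-derivative operators into the column vector $D_x=(\partial/\partial x_1,\dots,\partial/\partial x_d)^\top$. Applying $D_x$ through consecutive tensor (Kronecker) products keeps every derivative a vector:

\[
D_x \otimes f(x)
  \;=\;
  \begin{pmatrix}
    \partial f(x)/\partial x_1 \\
    \vdots \\
    \partial f(x)/\partial x_d
  \end{pmatrix}
  \in \mathbb{R}^{md},
  \qquad
  D_x \otimes D_x \otimes f(x)
  \;=\;
  D_x \otimes \bigl( D_x \otimes f(x) \bigr)
  \in \mathbb{R}^{md^2},
\]

where the block of the second expression indexed by the pair $(j,k)$ is $\partial^2 f(x)/(\partial x_j\,\partial x_k)$. With this ordering, $D_x \otimes f(x) = \operatorname{vec} J_f(x)$, the vectorized Jacobian of $f$, and any other ordering of the factors is reached by a permutation (commutation-type) matrix, reflecting the non-commutativity of the tensor product noted above.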