
Hessian automatic differentiation

In applied mathematics, Hessian automatic differentiation refers to techniques based on automatic differentiation (AD) that calculate the matrix of second derivatives of an n-dimensional function, known as the Hessian matrix.

For a given vector u ∈ ℝⁿ, these methods efficiently calculate the Hessian-vector product H(x)u. This can be used to build the entire Hessian by calculating H(x)e_i for i = 1, …, n, where e_i is the i-th standard basis vector.

An algorithm that calculates the entire Hessian with one forward and one reverse sweep of the computational graph is Edge_Pushing. Edge_Pushing is the result of applying the reverse mode of AD to the computational graph of the gradient. Since this graph has n output nodes, the reverse mode would in principle have to be applied to each output node separately; Edge_Pushing avoids this cost by taking overlapping calculations into account.

Graph colouring techniques exploit the sparsity pattern of the Hessian matrix together with cheap Hessian-vector products to recover the entire matrix, and are therefore well suited to large, sparse Hessians. The general strategy of any such colouring technique is as follows.
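The core idea of "differentiating the gradient" can be sketched with a minimal forward-mode pass (dual numbers) applied to a hand-coded gradient: the dual components of the gradient evaluated along direction u give the Hessian-vector product H(x)u, and the full Hessian follows from the products H(x)e_i. This is an illustrative sketch only; the example function f, its gradient, and all names here are invented for demonstration and do not reproduce the Edge_Pushing algorithm itself.

```python
class Dual:
    """Minimal dual number a + b*eps (eps^2 = 0) for forward-mode AD."""
    def __init__(self, re, du=0.0):
        self.re, self.du = re, du
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.re + o.re, self.du + o.du)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule carries the derivative part
        return Dual(self.re * o.re, self.re * o.du + self.du * o.re)
    __rmul__ = __mul__

def grad_f(x):
    # Analytic gradient of the example f(x) = x0^2 * x1 + x1^3.
    x0, x1 = x
    return [2 * x0 * x1, x0 * x0 + 3 * x1 * x1]

def hvp(grad, x, u):
    # Forward-mode sweep through the gradient: the dual parts are
    # the directional derivative of grad f along u, i.e. H(x)u.
    xd = [Dual(xi, ui) for xi, ui in zip(x, u)]
    return [gi.du for gi in grad(xd)]

def full_hessian(grad, x):
    # Assemble the Hessian column by column as H(x)e_i
    # (columns equal rows here, since the Hessian is symmetric).
    n = len(x)
    return [hvp(grad, x, [1.0 if j == i else 0.0 for j in range(n)])
            for i in range(n)]

print(full_hessian(grad_f, [1.0, 2.0]))  # → [[4.0, 2.0], [2.0, 12.0]]
```

For f(x) = x0² x1 + x1³ the exact Hessian at (1, 2) is [[2x1, 2x0], [2x0, 6x1]] = [[4, 2], [2, 12]], which the sketch recovers with n Hessian-vector products.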
