
Newton Step Methods for AD of an Objective Defined Using Implicit Functions

Abstract
We consider the problem of computing derivatives of an objective that is defined using implicit functions; i.e., the implicit variables are computed by solving equations, often nonlinear, by an iterative process. If one were to apply Algorithmic Differentiation (AD) directly, one would differentiate the iterative process. In this paper we present the Newton step methods for computing derivatives of the objective. These methods make it easy to take advantage of sparsity, forward mode, reverse mode, and other AD techniques. We prove that the partial Newton step method works if the number of steps is equal to the order of the derivatives. The full Newton step method obtains two derivative orders for each step, except for the first step. There are alternative methods that avoid differentiating the iterative process; e.g., the method implemented in ADOL-C. An optimal control example demonstrates the advantage of the Newton step methods when computing both gradients and Hessians. We also discuss the Laplace approximation method for nonlinear mixed effects models as an example application.
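The following is a minimal sketch of the idea, not code from the paper. It assumes the CppAD package and uses an example of our own choosing: the scalar implicit equation g(x, y) = y^2 - x = 0, whose solution is y(x) = sqrt(x), with objective F(x) = y(x). The converged solution is computed outside the AD recording and entered as a constant; a single Newton step is then recorded as a function of x, so differentiating the recording gives the first derivative of the objective without differentiating the iterative solver.

// Minimal sketch (our example, assuming CppAD): one recorded Newton step
// yields the first derivative of an implicitly defined objective.
# include <cppad/cppad.hpp>
# include <cmath>
# include <iostream>
# include <vector>

int main(void)
{   using CppAD::AD;

    double x0 = 2.0;

    // solve g(x0, y) = 0 by any iterative method, outside the AD recording
    double y0 = std::sqrt(x0);

    // record the objective with one Newton step appended to the solution
    std::vector< AD<double> > ax(1), aF(1);
    ax[0] = x0;
    CppAD::Independent(ax);

    AD<double> ay = y0;                // converged value, a constant on the tape
    AD<double> g  = ay * ay - ax[0];   // g(x, y) at the converged value
    AD<double> gy = 2.0 * ay;          // partial of g with respect to y
    ay            = ay - g / gy;       // one Newton step; now a function of x

    aF[0] = ay;                        // objective F(x) = y(x)
    CppAD::ADFun<double> F(ax, aF);

    // the derivative of the recorded function matches
    // dF / dx = 1 / ( 2 sqrt(x) ) at x = x0
    std::vector<double> x(1), jac(1);
    x[0] = x0;
    jac  = F.Jacobian(x);
    std::cout << "AD derivative = " << jac[0] << "\n";
    std::cout << "exact value   = " << 1.0 / (2.0 * std::sqrt(x0)) << "\n";

    return 0;
}

Per the abstract's claim for the partial Newton step method, one recorded step suffices for first derivatives; for Hessians, two steps would be appended before recording the objective.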