
Nonsmooth optimization by successive quadratic programming

Abstract
The problem of minimizing h(f(x)) over a real vector x of dimension n, where f maps continuously differentiably into the real space of dimension m and h is convex, is currently an important topic in optimization. We develop an algorithm that solves this problem and has several features that distinguish it from other algorithms currently in use. It uses a subproblem at each iteration that is defined in terms of approximate, rather than exact, derivatives of f. It requires only an approximate solution of the subproblem at each iteration. It ensures descent by means of a quadratic term instead of trust regions or line searches.  A generalization of the problem is also considered in which the real space of dimension m is replaced by the space of continuous functions on a compact set and h is either convex or globally Lipschitz continuous. This leads to an implementable method for minimizing the kind of penalty functions that occur in semi-infinite programming.
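The three distinguishing features above (approximate derivatives of f, approximate subproblem solutions, and a quadratic term that enforces descent) can be illustrated with a small sketch. This is not the paper's algorithm: it fixes h to the 1-norm, approximates the Jacobian by forward differences, solves each linearized subproblem loosely by subgradient descent, and grows the quadratic weight mu when a trial step fails to decrease the objective. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def nsmooth_sqp(f, x0, mu=1.0, outer=50, inner=100, eps=1e-8):
    """Illustrative sketch of minimizing h(f(x)) with h = ||.||_1.

    Each iteration linearizes f with a finite-difference (approximate)
    Jacobian J and approximately minimizes the subproblem
        ||f(x) + J d||_1 + (mu/2) ||d||^2
    by a few subgradient steps; the quadratic term, not a line search,
    is what guards descent (a trial step is rejected and mu increased
    if the objective does not decrease).
    """
    h = lambda y: np.abs(y).sum()          # fixed convex outer function
    x = np.asarray(x0, dtype=float)
    for _ in range(outer):
        fx = f(x)
        n, m = x.size, fx.size
        # approximate derivatives of f: forward differences
        J = np.empty((m, n))
        step = 1e-6
        for j in range(n):
            e = np.zeros(n)
            e[j] = step
            J[:, j] = (f(x + e) - fx) / step
        # approximate subproblem solve: subgradient descent on d
        d = np.zeros(n)
        for k in range(1, inner + 1):
            r = fx + J @ d
            g = J.T @ np.sign(r) + mu * d  # subgradient of the model
            d -= (0.1 / np.sqrt(k)) * g    # diminishing steps
        # accept only on sufficient decrease; else strengthen mu
        if h(f(x + d)) < h(fx) - eps:
            x = x + d
        else:
            mu *= 10.0
            if mu > 1e12:
                break
    return x
```

For an affine f the linearization is exact, so the sketch reduces to repeatedly solving the regularized subproblem; for example, minimizing ||(2x1 - 2, 3x2 - 3)||_1 drives x toward (1, 1).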
Input File: nsmooth.omh