
Learning using state space kernel machines

Abstract
Reconstruction of a function from noisy data is often formulated as a regularized optimization problem whose solution both closely matches an observed data set and has a small reproducing kernel Hilbert space norm. The loss functions that measure agreement between the data and the function are often smooth (e.g. the least squares penalty), but non-smooth loss functions are of interest in many applications. Using the least squares penalty, large machine learning problems with kernels amenable to a stochastic state space representation (which we call state space kernel machines) have been solved using a Kalman smoothing approach. In this paper we extend this approach for state space kernel machines to the Vapnik penalty, a particularly important non-smooth penalty that is robust to outlier noise in the measurements and induces a sparse representation of the reconstructed function. We exploit the structure of such models using interior point methods that efficiently solve the functional recovery problem with the Vapnik penalty, and we demonstrate the effectiveness of the method on a numerical experiment. The computational effort of our approach scales linearly with the number of data points.
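For reference, the Vapnik penalty mentioned above is commonly written as the epsilon-insensitive loss. This is the standard textbook formulation, not an equation taken from the paper; the symbols (residual r, insensitivity width epsilon) are generic notation:

\[
V_\epsilon(r) \;=\; \max\bigl(0,\; |r| - \epsilon\bigr)
\]

Residuals of magnitude at most \(\epsilon\) incur no cost, which is what induces a sparse representation, while the linear growth for \(|r| > \epsilon\) (in contrast to the quadratic growth of the least squares penalty) is what makes the loss robust to outlier noise.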
Input File: sskm.omh