CMU-ML-19-104
Machine Learning Department
School of Computer Science, Carnegie Mellon University



Nonparametric Methods with Total Variation Type Regularization

Veeranjaneyulu Sadhanala

May 2019

Ph.D. Thesis

CMU-ML-19-104.pdf


Keywords: Nonparametric regression, total variation, higher order total variation, image denoising, additive models, trend filtering, lattice graphs, hypothesis testing, Kolmogorov-Smirnov


We consider two special cases of the classical nonparametric regression problem of estimating a function f : ℝ^d → ℝ given n noisy observations at inputs x_1, …, x_n ∈ ℝ^d.
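
For reference, the observations follow the standard signal-plus-noise model (a sketch; the precise noise assumptions are as stated in the thesis):

    y_i = f(x_i) + \epsilon_i, \qquad i = 1, \dots, n,

with zero-mean errors \epsilon_i.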

In the first case, the input points correspond to the nodes of a d-dimensional regular lattice with equal side lengths n^{1/d}. We define two new higher-order total variation (TV) classes, whose signals are allowed to exhibit different amounts of smoothness in different regions of the lattice, and derive lower bounds on their minimax estimation errors. We analyze two naturally associated estimators and show that they are rate optimal over the appropriate classes. Linear smoothers are shown to be minimax suboptimal over these TV classes. A sketch of the first-order quantity underlying these classes is given below.
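
For concreteness (notation assumed here for illustration, not quoted from the thesis), writing \theta \in \mathbb{R}^n for a signal on the lattice graph G = (V, E), its first-order total variation is

    \mathrm{TV}(\theta) = \sum_{(i,j) \in E} |\theta_i - \theta_j| = \| D \theta \|_1,

where D is the edge incidence matrix of the lattice; the higher-order classes are built from higher-order graph difference operators of this type.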

In the second case, we consider additive models built from kth order trend filters, resulting in kth degree piecewise polynomial components. We derive fast error rates for additive trend filtering and prove that these rates are minimax optimal when the underlying function is itself additive and has component functions whose kth derivatives are of bounded total variation. We show that such rates are unattainable by additive models built from linear smoothers.
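
As a sketch of the univariate building block (notation assumed for illustration), given responses y \in \mathbb{R}^n at sorted inputs, kth order trend filtering solves

    \hat{\theta} = \mathop{\mathrm{argmin}}_{\theta \in \mathbb{R}^n} \tfrac{1}{2} \| y - \theta \|_2^2 + \lambda \| D^{(k+1)} \theta \|_1,

where D^{(k+1)} is the (k+1)st order discrete difference operator and \lambda \ge 0 is a tuning parameter; the additive model fits one such component per input dimension (e.g., by backfitting).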

We also study an extension of the Kolmogorov-Smirnov (KS) two-sample test, which can be more sensitive to differences in the tails. Our test statistic is an integral probability metric (IPM) defined over a higher-order total variation ball, recovering the original KS test as its simplest case.
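
For orientation, an integral probability metric between distributions P and Q over a function class \mathcal{F} takes the form

    \rho_{\mathcal{F}}(P, Q) = \sup_{f \in \mathcal{F}} \left| \mathbb{E}_{X \sim P} f(X) - \mathbb{E}_{Y \sim Q} f(Y) \right|,

and the classical KS statistic corresponds (up to constants) to taking \mathcal{F} to be indicators of half-lines, equivalently a unit total variation ball; the proposed test replaces \mathcal{F} with a higher-order total variation ball and evaluates the supremum at the two empirical distributions (this display is a sketch of the general IPM form, not the thesis's exact statistic).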

191 pages

Thesis Committee:
Ryan Tibshirani (Chair)
Aarti Singh
Dejan Slepčev
Larry Wasserman
James Sharpnack (University of California, Davis)

Roni Rosenfeld, Head, Machine Learning Department
Tom M. Mitchell, Interim Dean, School of Computer Science

