
A smoothed monotonic regression via L2 regularization

Authors
  • Sysoev, Oleg1
  • Burdakov, Oleg2
  • 1 Linköping University, Department of Computer and Information Science, Linköping, 58183, Sweden
  • 2 Linköping University, Department of Mathematics, Linköping, 58183, Sweden
Type
Published Article
Journal
Knowledge and Information Systems
Publisher
Springer-Verlag
Publication Date
Apr 26, 2018
Volume
59
Issue
1
Pages
197–218
Identifiers
DOI: 10.1007/s10115-018-1201-2
Source
Springer Nature
License
Green

Abstract

Monotonic regression is a standard method for extracting a monotone function from non-monotonic data, and it is used in many applications. However, a known drawback of this method is that its fitted response is a piecewise constant function, while practical response functions are often required to be continuous. The method proposed in this paper achieves monotonicity and smoothness of the regression by introducing an L2 regularization term. In order to achieve a low computational complexity and at the same time to provide a high predictive power of the method, we introduce a probabilistically motivated approach for selecting the regularization parameters. In addition, we present a technique for correcting inconsistencies on the boundary. We show that the complexity of the proposed method is O(n²). Our simulations demonstrate that when the data are large and the expected response is a complicated function (which is typical in machine learning applications) or when there is a change point in the response, the proposed method has a higher predictive power than many of the existing methods.
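The abstract describes augmenting the classical monotonic (isotonic) regression objective with an L2 penalty on successive differences of the fitted values. The sketch below is an illustration of that general idea only, not the authors' O(n²) algorithm or their probabilistic parameter selection: it minimizes ||f − y||² + λ·Σ(f[i+1] − f[i])² subject to f being non-decreasing, using projected gradient descent with the standard pool-adjacent-violators algorithm (PAVA) as the projection onto the monotone cone. Function names and the choice of λ are illustrative assumptions.

```python
def pava(y):
    """Project y onto the set of non-decreasing sequences
    (pool-adjacent-violators algorithm, squared-error loss)."""
    vals, wts = [], []          # block means and block sizes
    for v in y:
        vals.append(float(v))
        wts.append(1)
        # merge adjacent blocks while monotonicity is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-2] + wts[-1]
            m = (vals[-2] * wts[-2] + vals[-1] * wts[-1]) / w
            vals.pop(); wts.pop()
            vals[-1], wts[-1] = m, w
    out = []
    for v, w in zip(vals, wts):
        out.extend([v] * w)
    return out


def smoothed_monotone_fit(y, lam=1.0, iters=2000):
    """Monotone fit with an L2 smoothness penalty on first differences:
    minimize ||f - y||^2 + lam * sum_i (f[i+1] - f[i])^2,  f non-decreasing.
    Solved by projected gradient descent; PAVA is the projection step."""
    n = len(y)
    f = pava(y)                              # start from the plain isotonic fit
    step = 1.0 / (2.0 * (1.0 + 4.0 * lam))   # 1/L for this quadratic objective
    for _ in range(iters):
        # gradient of the smooth part of the objective
        g = [2.0 * (f[i] - y[i]) for i in range(n)]
        for i in range(n - 1):
            d = f[i + 1] - f[i]
            g[i] -= 2.0 * lam * d
            g[i + 1] += 2.0 * lam * d
        # gradient step, then project back onto the monotone cone
        f = pava([f[i] - step * g[i] for i in range(n)])
    return f


if __name__ == "__main__":
    y = [1.0, 3.0, 2.0, 4.0, 6.0, 5.0]
    print(pava(y))                           # piecewise-constant isotonic fit
    print(smoothed_monotone_fit(y, lam=0.5)) # smoother, still monotone
```

With λ = 0 the penalty vanishes and the fit reduces to ordinary isotonic regression; larger λ trades fidelity to the data for smaller jumps between adjacent fitted values, which is the smoothing effect the paper formalizes.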
