Sparse linear isotonic models

Sheng Chen, Arindam Banerjee

Research output: Contribution to conference › Paper › peer-review

1 Scopus citation

Abstract

In machine learning and data mining, linear models have been widely used to model the response as parametric linear functions of the predictors. To relax such stringent assumptions made by parametric linear models, additive models consider the response to be a summation of unknown transformations applied on the predictors; in particular, additive isotonic models (AIMs) assume the unknown transformations to be monotone. In this paper, we introduce sparse linear isotonic models (SLIMs) for high-dimensional problems by hybridizing ideas in parametric sparse linear models and AIMs, which enjoy a few appealing advantages over both. In the high-dimensional setting, a two-step algorithm is proposed for estimating the sparse parameters as well as the monotone functions over predictors. Under mild statistical assumptions, we show that the algorithm can accurately estimate the parameters. Promising preliminary experiments are presented to support the theoretical results.
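The two-step idea sketched in the abstract (first estimate a monotone transformation for each predictor, then fit a sparse linear model on the transformed predictors) can be illustrated with a generic numerical sketch. This is not the authors' SLIM algorithm, whose details are in the paper; the marginal isotonic fits, the `pav` and `lasso_ista` helpers, and the synthetic data below are all hypothetical illustrations of the general pattern.

```python
import numpy as np

def pav(y):
    # Pool Adjacent Violators: best nondecreasing fit to y in squared error.
    sums, counts = [], []
    for v in np.asarray(y, dtype=float):
        sums.append(v); counts.append(1)
        # Merge adjacent blocks while the monotonicity constraint is violated.
        while len(sums) > 1 and sums[-2] / counts[-2] > sums[-1] / counts[-1]:
            s, c = sums.pop(), counts.pop()
            sums[-1] += s; counts[-1] += c
    out, i = np.empty(len(y)), 0
    for s, c in zip(sums, counts):
        out[i:i + c] = s / c  # each block takes its mean value
        i += c
    return out

def isotonic_transform(x, y):
    # Marginal monotone transform of predictor x: isotonic regression of y on x.
    order = np.argsort(x)
    f = np.empty_like(y, dtype=float)
    f[order] = pav(y[order])
    return f

def lasso_ista(X, y, lam, iters=500):
    # Plain ISTA for the lasso objective (1/2n)||y - X b||^2 + lam ||b||_1.
    n, p = X.shape
    beta = np.zeros(p)
    L = np.linalg.norm(X, 2) ** 2 / n  # Lipschitz constant of the gradient
    for _ in range(iters):
        z = beta - X.T @ (X @ beta - y) / (n * L)   # gradient step
        beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return beta

# Step 1: replace each predictor by a marginal monotone transform.
# Step 2: run a sparse linear fit on the transformed design.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = np.exp(X[:, 0]) + 2.0 * X[:, 1] + 0.1 * rng.standard_normal(200)
Z = np.column_stack([isotonic_transform(X[:, j], y) for j in range(X.shape[1])])
beta = lasso_ista(Z, y, lam=0.05)
```

The marginal isotonic step here is a deliberate simplification: it fits each transform against the response independently, whereas a joint estimation procedure (as the paper's setting requires) must account for all predictors simultaneously.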

Original language: English (US)
Pages: 1270-1279
Number of pages: 10
State: Published - 2018
Event: 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018 - Playa Blanca, Lanzarote, Canary Islands, Spain
Duration: Apr 9, 2018 – Apr 11, 2018

Conference

Conference: 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018
Country: Spain
City: Playa Blanca, Lanzarote, Canary Islands
Period: 4/9/18 – 4/11/18

Bibliographical note

Funding Information:
The research was supported by NSF grants IIS-1563950, IIS-1447566, IIS-1447574, IIS-1422557, CCF-1451986, CNS-1314560, IIS-0953274, IIS-1029711, NASA grant NNX12AQ39A, and gifts from Adobe, IBM, and Yahoo.

Publisher Copyright:
Copyright 2018 by the author(s).
