Lipschitz properties for deep convolutional networks

Radu Balan, Maneesh Singh, Dongmian Zou

Research output: Chapter in Book/Report/Conference proceeding › Chapter


Abstract

In this paper we discuss the stability properties of convolutional neural networks, which are widely used in machine learning, chiefly as feature extractors in classification tasks. Ideally, inputs from the same class should yield similar features; that is, a small deformation of the input signal should produce only a small change in the feature vector. This can be established mathematically, and the key step is to derive the Lipschitz properties of the network. We further show that the stability results extend to more general networks. We give a formula for computing the Lipschitz bound and compare it with other methods, showing that our bound is closer to the optimal value.
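As a minimal numerical illustration of why tighter Lipschitz bounds matter (this is a generic sketch, not the formula from the paper): for a purely linear two-layer network f(x) = B(Ax), the exact Lipschitz constant is the spectral norm ‖BA‖, while the simple layer-wise bound is the product ‖B‖·‖A‖, which can be substantially looser.

```python
import numpy as np

# Hypothetical random layers standing in for linearized network layers.
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 64))
B = rng.standard_normal((64, 64))

# Layer-wise bound: product of per-layer spectral norms.
layerwise = np.linalg.norm(A, 2) * np.linalg.norm(B, 2)

# Exact Lipschitz constant of the composition: spectral norm of B @ A.
exact = np.linalg.norm(B @ A, 2)

print(f"layer-wise bound: {layerwise:.2f}")
print(f"exact constant:   {exact:.2f}")
```

For random matrices of this size the layer-wise product typically overestimates the exact constant by a large factor, which is the gap that sharper bounds, such as the one derived in the paper, aim to close.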

Original language: English (US)
Title of host publication: Contemporary Mathematics
Publisher: American Mathematical Society
Pages: 129-151
Number of pages: 23
Volume: 706
DOIs
State: Published - Mar 1 2018
Externally published: Yes

Publication series

Name: Contemporary Mathematics

Bibliographical note

25 pages, 10 figures

