Layer-wise analysis of deep networks with Gaussian kernels

Grégoire Montavon, Mikio L. Braun, Klaus-Robert Müller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Citations (Scopus)

Abstract

Deep networks can potentially express a learning problem more efficiently than local learning machines. While deep networks outperform local learning machines on some problems, it is still unclear how their nice representation emerges from their complex structure. We present an analysis based on Gaussian kernels that measures how the representation of the learning problem evolves layer after layer as the deep network builds higher-level abstract representations of the input. We use this analysis to show empirically that deep networks build progressively better representations of the learning problem and that the best representations are obtained when the deep network discriminates only in the last layers.
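In the spirit of the abstract, the layer-wise analysis can be sketched as follows: build a Gaussian kernel on a layer's activations, extract the leading kernel principal components, and record how well the labels are fit using only the first d of them; repeating this for each layer traces how the representation of the learning problem evolves through the network. The NumPy sketch below is illustrative only: the kernel width, the toy problem, and the helper names (gaussian_kernel, kernel_analysis_curve) are assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(X, sigma):
    # K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def kernel_analysis_curve(X, Y, sigma, dims):
    # For each d in dims, fit the labels Y with the d leading kernel principal
    # components of a Gaussian kernel on X and return the residual errors.
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    H = np.eye(n) - np.ones((n, n)) / n          # center the kernel in feature space
    eigvals, eigvecs = np.linalg.eigh(H @ K @ H)
    U = eigvecs[:, np.argsort(eigvals)[::-1]]    # leading components first
    errors = []
    for d in dims:
        P = U[:, :d]                             # orthonormal basis of top-d components
        Y_hat = P @ (P.T @ Y)                    # least-squares fit of the labels
        errors.append(float(np.mean((Y - Y_hat) ** 2)))
    return errors

# Hypothetical usage: compare the input with one hidden-layer representation.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
Y = np.sign(X[:, 0] * X[:, 1]).reshape(-1, 1)    # toy nonlinear labels
H1 = np.tanh(X @ rng.standard_normal((20, 50)))  # activations of a random layer
for name, Z in [("input", X), ("layer 1", H1)]:
    print(name, kernel_analysis_curve(Z, Y, sigma=5.0, dims=[1, 2, 4, 8, 16]))
```

A lower error at small d means the layer encodes the learning problem in a few leading kernel components; the paper's empirical claim is that this curve improves layer after layer in a well-trained deep network.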

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 23
Subtitle of host publication: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Publication status: Published - 2010
Externally published: Yes
Event: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 - Vancouver, BC, Canada
Duration: 2010 Dec 6 - 2010 Dec 9

Publication series

Name: Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010

Other

Other: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Country/Territory: Canada
City: Vancouver, BC
Period: 10/12/6 - 10/12/9

ASJC Scopus subject areas

  • Information Systems
