Subspace information criterion for nonquadratic regularizers-model selection for sparse regressors

Koji Tsuda, Masashi Sugiyama, Klaus-Robert Müller

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

Nonquadratic regularizers, in particular the l1-norm regularizer, can yield sparse solutions that generalize well. In this work we propose the generalized subspace information criterion (GSIC), which allows the generalization error to be predicted for this useful family of regularizers. We show that, under some technical assumptions, GSIC is an asymptotically unbiased estimator of the generalization error. In experiments with the l1-norm regularizer, GSIC is shown to perform well compared with the network information criterion (NIC) and cross-validation in relatively large-sample cases. In the small-sample case, however, GSIC tends to miss the optimal model because of its large variance. We therefore also introduce a biased version of GSIC, which achieves reliable model selection in the relevant and challenging scenario of high-dimensional data and few samples.
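The abstract does not give the GSIC formula, so the sketch below only illustrates the model-selection problem it addresses: choosing the l1 regularization strength of a sparse linear regressor, here via k-fold cross-validation, the resampling baseline mentioned above. The data, the grid of penalties, and the use of scikit-learn's LassoCV are illustrative assumptions, not the paper's setup.

# A minimal, hypothetical sketch (not the paper's GSIC): selecting the l1
# penalty of a sparse linear regressor by 5-fold cross-validation.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, d = 200, 50                              # relatively large-sample regime
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]    # only 5 of 50 weights are nonzero
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Grid of candidate l1 penalties; a criterion such as GSIC would score these
# analytically, whereas cross-validation scores them by held-out error.
alphas = np.logspace(-3, 0, 30)
model = LassoCV(alphas=alphas, cv=5).fit(X, y)

print("selected alpha:", model.alpha_)
print("nonzero coefficients:", int(np.count_nonzero(model.coef_)))

In the high-dimensional, few-sample regime highlighted at the end of the abstract, such resampling-based selection becomes unreliable, which is the setting the biased GSIC variant targets.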

Original language: English
Pages (from-to): 70-80
Number of pages: 11
Journal: IEEE Transactions on Neural Networks
Volume: 13
Issue number: 1
DOIs
Publication status: Published - January 2002

Bibliographical note

Funding Information:
Manuscript received November 8, 2000; revised June 27, 2001. The work of K.-R. Müller was supported in part by DFG under Contracts JA 379/91 and MU 987/11, and by the EU through the Neurocolt 2 and BLISS (IST-1999-14190) projects. K. Tsuda is with Fraunhofer FIRST, 12489 Berlin, Germany. He is also with the AIST Computational Biology Research Center, Tokyo 135-0064, Japan (e-mail: [email protected]). M. Sugiyama is with the Tokyo Institute of Technology, Tokyo 152-8552, Japan (e-mail: [email protected]). K.-R. Müller is with Fraunhofer FIRST, 12489 Berlin, Germany. He is also with the University of Potsdam, 14469 Potsdam, Germany (e-mail: [email protected]). Publisher Item Identifier S 1045-9227(02)00353-3.

Keywords

  • Kernel methods
  • Model selection
  • Regularization
  • Sparse regressors
  • Subspace information criterion (SIC)

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
