Dimensionality reduction based on ICA for regression problems

Nojun Kwak, Chunghoon Kim, Hwangnam Kim

    Research output: Contribution to journal › Article › peer-review

    20 Citations (Scopus)

    Abstract

    In manipulating data, as in supervised learning, we often extract new features from the original input variables in order to reduce the dimension of the input space and achieve better performance. In this paper, we show how standard algorithms for independent component analysis (ICA) can be extended to extract attributes for regression problems. The advantage is that general ICA algorithms become applicable to dimensionality reduction for regression problems by maximizing the joint mutual information between the target variable and the new attributes. We applied the proposed method to several real-world regression problems as well as some artificial problems, and compared its performance with that of other conventional methods. Experimental results show that the proposed method can efficiently reduce the dimension of the input space without degrading regression performance.
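    The pipeline the abstract describes — extract independent components from the inputs, then keep the components most informative about the target — can be sketched roughly as below. This is an illustrative stand-in, not the paper's algorithm: the symmetric FastICA update with a tanh nonlinearity is one standard ICA routine, and ranking components by absolute correlation with the target is a crude proxy for the joint mutual information criterion the authors actually maximize.

    ```python
    import numpy as np

    def fastica_components(X, n_components, n_iter=200, seed=0):
        """Extract independent components with a symmetric FastICA iteration."""
        # Center and whiten the input (standard ICA preprocessing).
        X = X - X.mean(axis=0)
        d, E = np.linalg.eigh(np.cov(X, rowvar=False))
        Z = X @ (E @ np.diag(1.0 / np.sqrt(d)) @ E.T)
        rng = np.random.default_rng(seed)
        W = rng.standard_normal((n_components, Z.shape[1]))
        for _ in range(n_iter):
            # Fixed-point update with the tanh contrast function.
            G = np.tanh(Z @ W.T)                     # (n_samples, n_components)
            g_prime = 1.0 - G ** 2
            W_new = (G.T @ Z) / len(Z) - np.diag(g_prime.mean(axis=0)) @ W
            # Symmetric decorrelation: W <- (W W^T)^{-1/2} W, via SVD.
            U, _, Vt = np.linalg.svd(W_new, full_matrices=False)
            W = U @ Vt
        return Z @ W.T                               # recovered components

    def select_by_target_correlation(S, y, k):
        """Keep the k components most correlated with the target.

        A simple stand-in for the paper's mutual-information criterion.
        """
        corr = np.abs([np.corrcoef(S[:, j], y)[0, 1] for j in range(S.shape[1])])
        return np.argsort(corr)[::-1][:k]

    # Toy demonstration: mix three sources, let y depend on two of them.
    rng = np.random.default_rng(1)
    n = 500
    S_true = np.column_stack([
        np.sign(rng.standard_normal(n)),   # non-Gaussian source used by y
        rng.laplace(size=n),               # non-Gaussian source used by y
        rng.standard_normal(n),            # irrelevant Gaussian component
    ])
    A = rng.standard_normal((3, 3))        # unknown mixing matrix
    X = S_true @ A.T
    y = 2.0 * S_true[:, 0] + 0.5 * S_true[:, 1]

    S = fastica_components(X, n_components=3)
    idx = select_by_target_correlation(S, y, k=2)
    X_reduced = S[:, idx]                  # reduced 2-D input for regression
    ```

    A downstream regressor would then be trained on `X_reduced` instead of the original inputs, which is the sense in which ICA here serves dimensionality reduction rather than blind source separation alone.
    
    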

    Original language: English
    Pages (from-to): 2596-2603
    Number of pages: 8
    Journal: Neurocomputing
    Volume: 71
    Issue number: 13-15
    DOIs
    Publication status: Published - Aug 2008

    Bibliographical note

    Copyright:
    Copyright 2017 Elsevier B.V., All rights reserved.

    Keywords

    • Dimensionality reduction
    • Feature extraction
    • ICA
    • Regression

    ASJC Scopus subject areas

    • Computer Science Applications
    • Cognitive Neuroscience
    • Artificial Intelligence
