Data compression and novelty filtering in retinotopic backpropagation networks

Hanseok Ko, R. H. Baran, Mohammed Arozullah

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

Three-layer networks with identical numbers of input and output units were trained using standard backpropagation to reproduce vectors of independent, identically distributed random variables. Data compression was accomplished by giving the hidden layer fewer units than the input and output layers. The trained nets gave a transparent response to training inputs and a translucent response when anomalies were added to elements of the training set: the reproduced vector closely resembled the unperturbed input. Subtracting the output vector from the input therefore yields a novelty filter, since the anomalies are dramatically enhanced in the difference vector.
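
The abstract describes what would now be called a bottleneck autoencoder used as a novelty filter. The sketch below is a minimal NumPy illustration of that idea under stated assumptions, not the authors' implementation: the layer sizes (16-4-16), sigmoid units, learning rate, number of training vectors, and the uniform random training data are all choices made for the example.

```python
# Minimal sketch: a three-layer network with a smaller hidden layer is trained
# by plain backpropagation to reproduce a fixed set of random input vectors.
# Subtracting the output from a perturbed input then highlights the anomaly.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_train = 16, 4, 6              # hidden layer smaller -> compression
train_set = rng.uniform(0.2, 0.8, (n_train, n_in))

W1 = rng.normal(0.0, 0.3, (n_hidden, n_in)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.3, (n_in, n_hidden)); b2 = np.zeros(n_in)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    h = sigmoid(W1 @ x + b1)                     # compressed hidden representation
    y = sigmoid(W2 @ h + b2)                     # reconstruction, same size as input
    return h, y

lr = 0.5
for epoch in range(5000):
    for x in train_set:
        h, y = forward(x)
        # Standard backpropagation of the squared reconstruction error.
        d_out = (y - x) * y * (1.0 - y)
        d_hid = (W2.T @ d_out) * h * (1.0 - h)
        W2 -= lr * np.outer(d_out, h); b2 -= lr * d_out
        W1 -= lr * np.outer(d_hid, x); b1 -= lr * d_hid

# Novelty filtering: perturb one element of a training vector and subtract
# the network's output from the input; the anomaly dominates the difference.
x_anom = train_set[0].copy()
x_anom[3] += 0.4                                 # injected anomaly (illustrative)
_, y_anom = forward(x_anom)
difference = x_anom - y_anom
print("largest |difference| at index", int(np.argmax(np.abs(difference))))
```

After training, the difference vector is small on the familiar components and large at the perturbed one, which is the novelty-filtering behavior the abstract describes.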

Original language: English
Title of host publication: 91 IEEE Int Jt Conf Neural Networks IJCNN 91
Publisher: IEEE
Pages: 2502-2507
Number of pages: 6
ISBN (Print): 0780302273, 9780780302273
DOIs
Publication status: Published - 1991
Event: 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91 - Singapore, Singapore
Duration: 1991 Nov 18 - 1991 Nov 21

Publication series

Name: 91 IEEE Int Jt Conf Neural Networks IJCNN 91

Other

Other: 1991 IEEE International Joint Conference on Neural Networks - IJCNN '91
City: Singapore, Singapore
Period: 91/11/18 - 91/11/21

ASJC Scopus subject areas

  • Engineering (all)
