Measures of Serial Data Compressibility by Neural Network Predictors

James P. Coughlin, R. H. Baran, Hanseok Ko

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A time series or univariate random process is compressible if it is predictable. Experiments with a variety of processes readily show that adaptive neural networks are at least as effective as their linear counterparts in one-step-ahead prediction. We explore the relationship between the predictive accuracy attained by the network, in the long run, and the closeness with which it can fit (and overfit) small segments of the same series in the course of many passes through the same data. Our findings suggest that the predictability of a process can be estimated by measuring the ease with which its increments can be overfitted.
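The abstract's central idea, that the ease of overfitting a short segment of a series tracks its long-run predictability, can be illustrated with a toy experiment. The sketch below is not the authors' code: it assumes NumPy, a single-hidden-layer tanh network trained by plain gradient descent, an AR(1) process as the "predictable" series, and white noise as the incompressible baseline, with all hyperparameters (lag, hidden units, number of passes, learning rate) chosen arbitrarily for illustration. For simplicity it fits the raw series rather than its increments, as the paper does.

```python
# A minimal sketch (not the authors' method): a one-hidden-layer network trained
# by gradient descent for one-step-ahead prediction. "Overfittability" of a short
# segment -- how far training error falls over many passes through the same data --
# is compared for a predictable AR(1) series and for pure white noise.
import numpy as np

rng = np.random.default_rng(0)

def make_ar1(n, phi=0.9):
    """Predictable series: x[t] = phi * x[t-1] + noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

def windows(x, lag=4):
    """Stack lagged inputs X and one-step-ahead targets y."""
    X = np.column_stack([x[i:len(x) - lag + i] for i in range(lag)])
    y = x[lag:]
    return X, y

def overfit_score(x, lag=4, hidden=8, passes=500, lr=0.01):
    """Training MSE on a short segment after many passes:
    the lower it falls, the easier the segment is to (over)fit."""
    X, y = windows(x, lag)
    W1 = rng.normal(scale=0.1, size=(lag, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.1, size=hidden)
    b2 = 0.0
    for _ in range(passes):
        h = np.tanh(X @ W1 + b1)        # hidden-layer activations
        err = h @ W2 + b2 - y           # one-step-ahead prediction error
        # Backpropagate the squared-error loss through both layers.
        gW2 = h.T @ err / len(y)
        gb2 = err.mean()
        gh = np.outer(err, W2) * (1 - h ** 2)
        gW1 = X.T @ gh / len(y)
        gb1 = gh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    h = np.tanh(X @ W1 + b1)
    return np.mean((h @ W2 + b2 - y) ** 2)

segment = 64  # a small segment, revisited over many passes
print("final training MSE, AR(1):      ", overfit_score(make_ar1(segment)))
print("final training MSE, white noise:", overfit_score(rng.normal(size=segment)))
```

On a run of this kind one would expect the AR(1) training error to fall well below the white-noise error, mirroring the paper's suggestion that overfittability can serve as an estimate of predictability, and hence of compressibility.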

Original language: English
Title of host publication: Proceedings - 1992 International Joint Conference on Neural Networks, IJCNN 1992
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 755-760
Number of pages: 6
ISBN (Electronic): 0780305590
Publication status: Published - 1992
Externally published: Yes
Event: 1992 International Joint Conference on Neural Networks, IJCNN 1992 - Baltimore, United States
Duration: 1992 Jun 7 - 1992 Jun 11

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 1

Conference

Conference: 1992 International Joint Conference on Neural Networks, IJCNN 1992
Country/Territory: United States
City: Baltimore
Period: 92/6/7 - 92/6/11

Bibliographical note

Publisher Copyright:
© 1992 IEEE.

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
