Covariance shrinkage for autocorrelated data

Daniel Bartz, Klaus-Robert Müller

Research output: Contribution to journal › Conference article › peer-review

15 Citations (Scopus)


The accurate estimation of covariance matrices is essential for many signal processing and machine learning algorithms. In high-dimensional settings the sample covariance is known to perform poorly, so regularization strategies such as the analytic shrinkage of Ledoit and Wolf are applied. The standard setting assumes i.i.d. data; in practice, however, time series typically exhibit strong autocorrelation structure, which introduces a pronounced estimation bias. Recent work by Sancetta has extended the shrinkage framework beyond i.i.d. data. We contribute to this line of work by showing that the Sancetta estimator, while consistent in the high-dimensional limit, suffers from high bias in finite sample sizes. We propose an alternative estimator, which is (1) unbiased, (2) less sensitive to hyperparameter choice, and (3) yields superior performance in simulations on toy data and on a real-world data set from an EEG-based brain-computer interfacing experiment.
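To make the abstract's setup concrete, the following is a minimal NumPy sketch of standard (i.i.d.) Ledoit-Wolf-style shrinkage toward a scaled identity target, i.e. the baseline the paper builds on, not the autocorrelation-corrected estimator it proposes. The shrinkage-intensity formula used here is a simplified illustrative version; the dimensions and data are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 30, 50  # high-dimensional regime: fewer samples than dimensions
X = rng.standard_normal((n, p))

# Sample covariance (singular when n < p)
S = np.cov(X, rowvar=False, bias=True)

# Shrinkage target: scaled identity with the same average eigenvalue
mu = np.trace(S) / p
target = mu * np.eye(p)

# Simplified i.i.d. shrinkage intensity: estimated variance of the
# entries of S divided by the squared distance of S from the target
Xc = X - X.mean(axis=0)
var_S = sum(np.sum((np.outer(x, x) - S) ** 2) for x in Xc) / n**2
dist2 = np.sum((S - target) ** 2)
lam = min(1.0, var_S / dist2)

# Convex combination of sample covariance and target
S_shrunk = (1 - lam) * S + lam * target
```

Because `lam > 0` and `mu > 0`, the shrunk estimate is positive definite even though the sample covariance itself is rank-deficient. Under autocorrelation, the variance estimate `var_S` computed from per-sample outer products is biased, which is precisely the issue the paper addresses.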

Original language: English
Pages (from-to): 1592-1600
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Issue number: January
Publication status: Published - 2014
Event: 28th Annual Conference on Neural Information Processing Systems 2014, NIPS 2014 - Montreal, Canada
Duration: 2014 Dec 8 - 2014 Dec 13

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing


