Transformer-based multivariate time series anomaly detection using inter-variable attention mechanism

Hyeongwon Kang, Pilsung Kang

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Multivariate time-series anomaly detection aims to spot deviations from regular patterns in time-series data collected concurrently from multiple sensors and systems, and it finds application across diverse industries in system-maintenance tasks. Capturing temporal dependencies and inter-variable correlations simultaneously is challenging because the variables in a multivariate time series are interconnected and mutually influential. In this paper, we propose the Variable Temporal Transformer (VTT), which uses the Transformer's self-attention mechanism to effectively model both temporal dependencies and relationships among variables: the proposed model performs anomaly detection by employing temporal self-attention to model temporal dependencies and variable self-attention to model variable correlations. Having identified that the point adjustment protocol can overestimate the performance of traditional time-series anomaly detection methods, we adopt a recently introduced evaluation metric and confirm that the proposed method achieves state-of-the-art performance under it. Furthermore, we introduce an anomaly interpretation module to shed light on anomalous data, which we validate using both synthetic and real-world industrial data.
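The dual-attention idea described in the abstract (temporal self-attention over time steps plus variable self-attention over variables, scored by reconstruction error) can be sketched roughly as follows. This is an illustrative toy, not the authors' VTT implementation; the class and parameter names (`DualAxisAttention`, `window`, `d_model`) are assumptions introduced for the example.

```python
import torch
import torch.nn as nn


class DualAxisAttention(nn.Module):
    """Toy sketch of two-axis self-attention for a multivariate window.

    NOTE: illustrative only -- not the VTT architecture from the paper.
    Input x has shape (batch, T, V): T time steps of V variables.
    """

    def __init__(self, n_vars, window, d_model=32, n_heads=4):
        super().__init__()
        # Temporal branch: each time step (a V-dim slice) becomes a token.
        self.time_embed = nn.Linear(n_vars, d_model)
        self.temporal_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.time_out = nn.Linear(d_model, n_vars)
        # Variable branch: each variable (a T-dim series) becomes a token.
        self.var_embed = nn.Linear(window, d_model)
        self.variable_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.var_out = nn.Linear(d_model, window)

    def forward(self, x):                                   # x: (B, T, V)
        t_tok = self.time_embed(x)                          # (B, T, d)
        t_ctx, _ = self.temporal_attn(t_tok, t_tok, t_tok)  # attend over time
        v_tok = self.var_embed(x.transpose(1, 2))           # (B, V, d)
        v_ctx, _ = self.variable_attn(v_tok, v_tok, v_tok)  # attend over variables
        # Combine both branches into a reconstruction of the input window.
        recon = self.time_out(t_ctx) + self.var_out(v_ctx).transpose(1, 2)
        return recon                                        # (B, T, V)


def anomaly_score(x, recon):
    """Per-time-step anomaly score: mean squared reconstruction error."""
    return (x - recon).pow(2).mean(dim=-1)                  # (B, T)
```

In a reconstruction-based setup like this, the model would be trained on normal data only, so windows containing anomalies reconstruct poorly and receive high scores; points whose score exceeds a chosen threshold are flagged as anomalous.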

Original language: English
Article number: 111507
Journal: Knowledge-Based Systems
Volume: 290
DOIs
Publication status: Published - 22 Apr 2024

Bibliographical note

Publisher Copyright:
© 2024 Elsevier B.V.

Keywords

  • Anomaly detection
  • Attention mechanism
  • Multivariate time-series
  • Transformer
  • XAI

ASJC Scopus subject areas

  • Software
  • Management Information Systems
  • Information Systems and Management
  • Artificial Intelligence
