Continual Learning With Speculative Backpropagation and Activation History

Sangwoo Park, Taeweon Suh

    Research output: Contribution to journal › Article › peer-review

    1 Citation (Scopus)

    Abstract

    Continual learning is gaining traction with the explosive emergence of deep learning applications. It suffers from a severe problem called catastrophic forgetting: a trained model loses previously learned information when it is trained on new data. This paper proposes two novel ideas for mitigating catastrophic forgetting: Speculative Backpropagation (SB) and Activation History (AH). SB performs backpropagation based on past knowledge, while AH isolates the weights that are important for the previous task. We evaluated the performance of our scheme in terms of accuracy and training time. The experimental results show a 4.4% improvement in knowledge preservation and a 31% reduction in training time compared to the state-of-the-art methods (EWC and SI).
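    The abstract only outlines the two mechanisms, so the following is a minimal, hypothetical PyTorch sketch of the activation-history idea: accumulate per-unit activation statistics while training the old task, then attenuate gradient updates to weights feeding highly active (important) units when training the new task. All names here (`ActivationHistory`, `attenuate_grads`) are illustrative assumptions, not the authors' implementation, and the masking rule is a simplification of the general weight-isolation concept.

```python
# Hypothetical sketch of an "activation history" mechanism for mitigating
# catastrophic forgetting. Not the paper's implementation.
import torch
import torch.nn as nn

class ActivationHistory:
    """Tracks per-unit activation magnitudes for one linear layer."""

    def __init__(self, layer: nn.Linear):
        self.layer = layer
        self.history = torch.zeros(layer.out_features)
        layer.register_forward_hook(self._record)

    def _record(self, module, inputs, output):
        # Accumulate the mean absolute activation of each output unit
        # over the batch dimension.
        self.history = self.history.to(output.device)
        self.history += output.detach().abs().mean(dim=0)

    def attenuate_grads(self):
        # Scale down gradients of weights feeding units that were highly
        # active on the previous task, so those weights change less.
        importance = self.history / (self.history.max() + 1e-8)  # in [0, 1]
        scale = (1.0 - importance).unsqueeze(1)  # one scale per output row
        if self.layer.weight.grad is not None:
            self.layer.weight.grad *= scale
        if self.layer.bias is not None and self.layer.bias.grad is not None:
            self.layer.bias.grad *= scale.squeeze(1)
```

    In use, the hook records history during training on Task A; on Task B, `attenuate_grads()` would be called between `loss.backward()` and `optimizer.step()` so that important weights from the old task are updated less aggressively. This masking-style protection is similar in spirit to the regularization penalties of EWC and SI that the paper compares against, but avoids computing an extra loss term.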

    Original language: English
    Pages (from-to): 38555-38564
    Number of pages: 10
    Journal: IEEE Access
    Volume: 10
    DOIs
    Publication status: Published - 2022

    Bibliographical note

    Publisher Copyright:
    © 2013 IEEE.

    Keywords

    • Continual learning
    • FPGA
    • activation history
    • catastrophic forgetting
    • lifelong learning
    • parallel training
    • speculative backpropagation
    • training accelerator

    ASJC Scopus subject areas

    • General Engineering
    • General Materials Science
    • General Computer Science

