Speculative Backpropagation for CNN Parallel Training

Sangwoo Park, Taeweon Suh

    Research output: Contribution to journal › Article › peer-review

    14 Citations (Scopus)

    Abstract

    Parallel training of neural networks can greatly shorten the training time, but prior efforts were mostly limited to distributing inputs across multiple computing engines because the gradient descent algorithm used in neural network training is inherently sequential. This paper proposes a novel parallel training method for CNN-based image recognition. It overcomes the sequential nature of gradient descent and enables parallel training through speculative backpropagation. We found that the Softmax and ReLU outcomes of the forward propagation for inputs with the same label are likely to be very similar. This characteristic makes it possible to perform the forward and backward propagation simultaneously. We implemented the proposed parallel model with CNNs in both software and hardware and evaluated its performance. The parallel training reduces the training time by 34% on CIFAR-100 without loss of prediction accuracy compared to sequential training; in many cases, it even improves the accuracy.
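    To make the mechanism concrete, the following is a minimal, hypothetical sketch in Python/NumPy of the speculation idea described in the abstract: the last softmax output observed for each label is cached and reused to start the backward pass before the current forward pass finishes, and the speculative gradient is kept only if the actual output turns out to be close enough. The class, tolerance threshold, and toy linear classifier are illustrative assumptions, not the authors' CNN or FPGA implementation.

```python
import numpy as np

# Sketch of speculative backpropagation (illustrative assumptions, not the
# paper's implementation): softmax outputs for samples with the same label tend
# to be similar, so a cached output for that label can seed the backward pass
# before the current forward pass completes.

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

class SpecTrainer:
    def __init__(self, n_features, n_classes, lr=0.1, tolerance=5e-2):
        rng = np.random.default_rng(0)
        self.W = rng.normal(scale=0.01, size=(n_classes, n_features))
        self.lr = lr
        self.tolerance = tolerance
        self.cache = {}  # label -> last softmax output seen for that label

    def _grad(self, x, probs, label):
        # Cross-entropy gradient w.r.t. W for a softmax classifier.
        err = probs.copy()
        err[label] -= 1.0
        return np.outer(err, x)

    def step(self, x, label):
        cached = self.cache.get(label)
        if cached is not None:
            # Speculative backward pass, conceptually overlapped with forward().
            spec_grad = self._grad(x, cached, label)

        probs = softmax(self.W @ x)              # actual forward pass
        self.cache[label] = probs

        if cached is not None and np.abs(probs - cached).max() < self.tolerance:
            grad = spec_grad                     # speculation verified: keep its result
        else:
            grad = self._grad(x, probs, label)   # mis-speculation: recompute normally
        self.W -= self.lr * grad

# Toy usage: two Gaussian blobs, one per class.
rng = np.random.default_rng(1)
trainer = SpecTrainer(n_features=2, n_classes=2)
for _ in range(200):
    label = int(rng.integers(2))
    x = rng.normal(loc=(2 * label - 1), scale=0.5, size=2)
    trainer.step(x, label)
```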

    Original language: English
    Article number: 9272337
    Pages (from-to): 215365-215374
    Number of pages: 10
    Journal: IEEE Access
    Volume: 8
    DOIs
    Publication status: Published - 2020

    Bibliographical note

    Funding Information:
    This work was supported by the Institute of Information and Communications Technology Planning and Evaluation (IITP) grants funded by the Korean Government (MSIT): Grant 2019-0-00533 (Research on CPU vulnerability detection and validation) and Grant 2019-0-01343 (Regional strategic industry convergence security core talent training business).

    Publisher Copyright:
    © 2013 IEEE.

    Keywords

    • Deep learning
    • FPGA
    • parallel training
    • speculative backpropagation
    • training accelerator

    ASJC Scopus subject areas

    • General Computer Science
    • General Materials Science
    • General Engineering
