Improved recurrent generative adversarial networks with regularization techniques and a controllable framework

Minhyeok Lee, Donghyun Tae, Jae Hun Choi, Ho Youl Jung, Junhee Seok

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)


Generative Adversarial Network (GAN), a deep learning framework that generates synthetic but realistic samples, has produced astonishing results for image synthesis. However, because GANs are routinely applied to image datasets, their regularization methods have been developed mainly for convolutional layers. In this study, to extend these methods to time-series data, one of the most common data types in real-world datasets, modified regularization methods are proposed for Long Short-Term Memory (LSTM)-based GANs. Specifically, spectral normalization, hinge loss, orthogonal regularization, and the truncation trick are modified and assessed for LSTM-based GANs. Furthermore, a conditional GAN architecture called Controllable GAN (ControlGAN) is applied to LSTM-based GANs to produce desired samples. The evaluations are conducted with sine wave data, air pollution datasets, and a medical time-series dataset obtained from intensive care units. As a result, ControlGAN with spectral normalization on gates and cell states consistently outperforms the others, including the conventional model, Recurrent Conditional GAN (RCGAN).
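Spectral normalization, the technique the abstract highlights, divides a weight matrix by an estimate of its largest singular value so the layer becomes roughly 1-Lipschitz; the paper applies this to the gate and cell-state weight matrices of the LSTM. The sketch below illustrates only the standard power-iteration estimate in plain Python; the function names and the choice of matrices to normalize are illustrative, not the authors' implementation.

```python
import math
import random

def _matvec(M, v):
    # Multiply matrix M (list of rows) by vector v.
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def _transpose(M):
    return [list(col) for col in zip(*M)]

def _normalize(v):
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def spectral_norm(W, n_iters=100, seed=0):
    """Estimate the largest singular value of W via power iteration,
    as commonly done in spectral normalization for GANs."""
    rng = random.Random(seed)
    u = _normalize([rng.gauss(0.0, 1.0) for _ in range(len(W))])
    Wt = _transpose(W)
    v = u
    for _ in range(n_iters):
        v = _normalize(_matvec(Wt, u))   # v <- W^T u / ||W^T u||
        u = _normalize(_matvec(W, v))    # u <- W v / ||W v||
    # sigma ≈ u^T W v
    Wv = _matvec(W, v)
    return sum(a * b for a, b in zip(u, Wv))

def spectrally_normalized(W, n_iters=100):
    """Return W / sigma(W), so the result has spectral norm ≈ 1.
    For an LSTM-based GAN, one would apply this to each gate
    weight matrix (input, forget, output, cell) per the paper's idea."""
    sigma = spectral_norm(W, n_iters)
    return [[x / sigma for x in row] for row in W]
```

In practice this would run on the framework's tensors (e.g. one power-iteration step per training update, with `u` persisted between steps); the pure-Python version above only makes the arithmetic explicit.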

Original language: English
Pages (from-to): 428-443
Number of pages: 16
Journal: Information Sciences
Publication status: Published - 2020 Oct


Keywords

  • Generative adversarial network
  • Long short-term memory
  • Recurrent neural network
  • Sample generation
  • Spectral normalization

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Theoretical Computer Science
  • Computer Science Applications
  • Information Systems and Management
  • Artificial Intelligence

