Integrating Pre-Trained Language Model into Neural Machine Translation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Neural Machine Translation (NMT) has become a significant technology in natural language processing through extensive research and development. However, the scarcity of high-quality bilingual parallel data still poses a major challenge to improving NMT performance. Recent studies have explored the use of contextual information from pre-trained language models (PLMs) to address this problem. Yet, the issue of incompatibility between PLMs and NMT models remains unresolved. This study proposes a PLM-integrated NMT (PiNMT) model to overcome the identified problems. The PiNMT model consists of three critical components: PLM Multi-Layer Converter, Embedding Fusion, and Cosine Alignment, each playing a vital role in providing effective PLM information to NMT. Furthermore, two training strategies, Separate Learning Rates and Dual Step Training, are also introduced in this paper. By implementing the proposed PiNMT model and training strategies, we achieved state-of-the-art performance on the IWSLT'14 En↔De dataset. This study's outcomes are noteworthy as they demonstrate a novel approach for efficiently integrating PLMs with NMT to overcome incompatibility and enhance performance.
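The abstract names three components without detailing them. As a rough illustration only, the sketch below shows one plausible reading of each idea: collapsing per-layer PLM hidden states into a single representation, gating PLM features into NMT embeddings, and a cosine-based alignment loss. All function names, shapes, and the gating scheme are assumptions for illustration, not the paper's actual design.

```python
import numpy as np

def combine_plm_layers(layer_states, weights):
    """Hypothetical multi-layer converter: collapse per-layer PLM hidden
    states of shape [L, T, H] into one [T, H] representation via a
    softmax-weighted sum over the L layers (ELMo-style; an assumption)."""
    w = np.exp(weights - np.max(weights))
    w = w / w.sum()                          # softmax over layers
    return np.tensordot(w, layer_states, axes=1)

def fuse_embeddings(nmt_emb, plm_feat, gate):
    """Hypothetical embedding fusion: blend NMT token embeddings with
    PLM features using a scalar gate in [0, 1]."""
    return gate * plm_feat + (1.0 - gate) * nmt_emb

def cosine_alignment_loss(a, b, eps=1e-8):
    """Hypothetical cosine alignment: penalize 1 - cosine similarity
    between NMT hidden states and PLM-derived states, averaged over tokens."""
    num = (a * b).sum(axis=-1)
    den = np.linalg.norm(a, axis=-1) * np.linalg.norm(b, axis=-1) + eps
    return float(np.mean(1.0 - num / den))
```

With uniform layer weights, `combine_plm_layers` reduces to a simple mean over layers; the gate and loss weight would be tuned jointly with the two training strategies the paper mentions.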

Original language: English
Title of host publication: Proceedings - 2023 2nd International Conference on Frontiers of Communications, Information System and Data Science, CISDS 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 59-66
Number of pages: 8
ISBN (Electronic): 9798350381474
DOIs
Publication status: Published - 2023
Event: 2nd International Conference on Frontiers of Communications, Information System and Data Science, CISDS 2023 - Xi'an, China
Duration: 2023 Nov 24 - 2023 Nov 26

Publication series

Name: Proceedings - 2023 2nd International Conference on Frontiers of Communications, Information System and Data Science, CISDS 2023

Conference

Conference: 2nd International Conference on Frontiers of Communications, Information System and Data Science, CISDS 2023
Country/Territory: China
City: Xi'an
Period: 23/11/24 - 23/11/26

Bibliographical note

Publisher Copyright:
© 2023 IEEE.

Keywords

  • BERT
  • Catastrophic Forgetting
  • Fine-tuning
  • Neural Machine Translation
  • Pre-trained Language Model
  • Transformer

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
  • Information Systems and Management
  • Communication
