Abstract
Neural Machine Translation (NMT) has become a significant technology in natural language processing through extensive research and development. However, the scarcity of high-quality bilingual parallel data still poses a major challenge to improving NMT performance. Recent studies have explored the use of contextual information from pre-trained language models (PLMs) to address this problem. Yet, the incompatibility between PLMs and NMT models remains unresolved. This study proposes a PLM-integrated NMT (PiNMT) model to overcome these problems. The PiNMT model consists of three critical components: PLM Multi Layer Converter, Embedding Fusion, and Cosine Alignment, each playing a vital role in providing effective PLM information to NMT. Furthermore, two training strategies, Separate Learning Rates and Dual Step Training, are also introduced in this paper. By implementing the proposed PiNMT model and training strategies, we achieved state-of-the-art performance on the IWSLT'14 En↔De dataset. This study's outcomes are noteworthy as they demonstrate a novel approach for efficiently integrating PLMs with NMT to overcome incompatibility and enhance performance.
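The abstract names three components without detailing their mechanics. As a purely illustrative sketch, not the authors' implementation, the following NumPy code shows one plausible reading of each: a "Multi Layer Converter" as an ELMo-style learned weighted sum over all PLM layer outputs, "Embedding Fusion" as a gated mix of NMT token embeddings with the PLM features, and "Cosine Alignment" as a mean (1 − cosine similarity) loss between two representations. All function names and signatures here are assumptions for exposition.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D weight vector
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_layer_converter(plm_layers, layer_logits):
    # plm_layers: (L, T, d) hidden states from all L PLM layers.
    # ASSUMPTION: the converter is a learned scalar mix over layers,
    # analogous to ELMo-style layer weighting.
    w = softmax(layer_logits)                    # (L,)
    return np.tensordot(w, plm_layers, axes=1)   # (T, d)

def embedding_fusion(nmt_emb, plm_feat, gate=0.5):
    # ASSUMPTION: fusion is a simple convex (gated) combination of
    # NMT embeddings and converted PLM features; shapes (T, d).
    return gate * nmt_emb + (1.0 - gate) * plm_feat

def cosine_alignment_loss(a, b, eps=1e-8):
    # ASSUMPTION: "Cosine Alignment" penalizes angular distance,
    # i.e. mean (1 - cosine similarity) across tokens.
    num = (a * b).sum(axis=-1)
    den = np.linalg.norm(a, axis=-1) * np.linalg.norm(b, axis=-1) + eps
    return float((1.0 - num / den).mean())
```

In this reading, the cosine loss is zero when the two representations point in the same direction per token, which would encourage the NMT encoder to stay aligned with the PLM features during the paper's Dual Step Training.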
Original language | English |
---|---|
Title of host publication | Proceedings - 2023 2nd International Conference on Frontiers of Communications, Information System and Data Science, CISDS 2023 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 59-66 |
Number of pages | 8 |
ISBN (Electronic) | 9798350381474 |
DOIs | |
Publication status | Published - 2023 |
Event | 2nd International Conference on Frontiers of Communications, Information System and Data Science, CISDS 2023 - Xi'an, China Duration: 2023 Nov 24 → 2023 Nov 26 |
Publication series
Name | Proceedings - 2023 2nd International Conference on Frontiers of Communications, Information System and Data Science, CISDS 2023 |
---|
Conference
Conference | 2nd International Conference on Frontiers of Communications, Information System and Data Science, CISDS 2023 |
---|---|
Country/Territory | China |
City | Xi'an |
Period | 23/11/24 → 23/11/26 |
Bibliographical note
Publisher Copyright:© 2023 IEEE.
Keywords
- BERT
- Catastrophic Forgetting
- Fine-tuning
- Neural Machine Translation
- Pre-trained Language Model
- Transformer
ASJC Scopus subject areas
- Computer Networks and Communications
- Information Systems
- Signal Processing
- Information Systems and Management
- Communication