Existing methods of training domain-specialized neural machine translation (DS-NMT) models are based on the pretrain-finetune approach (PFA). In this study, we reinterpret these methods from the perspective of cognitive science, specifically cross-language speech perception, and propose the cross communication method (CCM), a new DS-NMT training approach. Inspired by the way infants learn language, CCM performs DS-NMT training by configuring and training the DC and GC concurrently within batches. Quantitative and qualitative analyses of our experimental results show that CCM achieves superior performance compared to conventional methods. Additionally, we conducted an experiment targeting a DS-NMT service to meet industrial demands.
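The abstract's core idea is training on both corpora concurrently within batches rather than sequentially as in pretrain-finetune. A minimal sketch of such mixed batching is below, assuming DC and GC denote a domain-specific and a general-domain corpus respectively (the abstract does not expand these abbreviations), and assuming a fixed per-batch mixing ratio; the paper's actual batching scheme may differ.

```python
import random


def mixed_batches(domain_corpus, general_corpus, batch_size, dc_ratio=0.5, seed=0):
    """Yield batches that mix domain (DC) and general (GC) examples.

    Each batch holds roughly dc_ratio * batch_size domain examples and the
    rest general examples, so the model sees both corpora in every update
    instead of pretraining on GC and then finetuning on DC.
    """
    rng = random.Random(seed)
    n_dc = max(1, round(batch_size * dc_ratio))
    n_gc = batch_size - n_dc

    dc = list(domain_corpus)
    gc = list(general_corpus)
    rng.shuffle(dc)
    rng.shuffle(gc)

    batches = []
    i = j = 0
    # Stop once either corpus can no longer fill its share of a batch.
    while i + n_dc <= len(dc) and j + n_gc <= len(gc):
        batch = dc[i:i + n_dc] + gc[j:j + n_gc]
        rng.shuffle(batch)  # interleave DC and GC examples within the batch
        batches.append(batch)
        i += n_dc
        j += n_gc
    return batches
```

In a real NMT training loop, each mixed batch would be fed to a single optimizer step, so domain and general signals shape the same parameters throughout training.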
Bibliographical note
Funding Information:
This work was supported in part by the Ministry of Science and ICT (MSIT), South Korea, through the Information Technology Research Center (ITRC) Support Program supervised by the Institute of Information and Communications Technology Planning and Evaluation (IITP) under Grant IITP-2018-0-01405; in part by the IITP Grant funded by the Korea Government (MSIT) (A Neural-Symbolic Model for Knowledge Acquisition and Inference Techniques) under Grant 2020-0-00368; and in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education under Grant NRF-2021R1A6A1A03045425.
© 2013 IEEE.
Keywords
- Domain-specialized neural machine translation
- cross communication method
- deep learning
- neural machine translation
ASJC Scopus subject areas
- Materials Science (all)
- Electrical and Electronic Engineering
- Computer Science (all)