A study of BERT for context-aware neural machine translation
Authors: Xueqing Wu, Yingce Xia, Jinhua Zhu, Lijun Wu, Shufang Xie, Tao Qin
Abstract
Context-aware neural machine translation (NMT), which aims to translate sentences using contextual information, has attracted much attention recently. A key problem for context-aware NMT is how to effectively encode and aggregate the contextual information. BERT (Devlin et al., NAACL 2019) has proven to be an effective feature extractor in natural language understanding tasks, but it has not been well studied in context-aware NMT. In this work, we conduct a study on leveraging BERT to encode the contextual information for NMT, and explore three commonly used methods to aggregate the contextual features. We conduct experiments on five translation tasks and find that concatenating all contextual sequences into a single longer sequence and encoding it with BERT yields the best translation results. Specifically, we achieve state-of-the-art BLEU scores on several widely investigated tasks, including IWSLT’14 German\(\rightarrow\)English, News Commentary v11 English\(\rightarrow\)German, and OpenSubtitle English\(\rightarrow\)Russian translation.
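As a minimal illustration of the best-performing setup described in the abstract (concatenating the context sentences into one longer sequence and encoding it with BERT), the sketch below uses the Hugging Face transformers library. The checkpoint name, the sample sentences, and the feature-extraction step are illustrative assumptions, not the authors' code; how the extracted features are aggregated into the NMT encoder-decoder is the subject of the paper and is not reproduced here.

```python
# Minimal sketch (assumptions noted above): encode concatenated context
# with a pretrained BERT to obtain contextual features for an NMT model.
import torch
from transformers import BertModel, BertTokenizer

# Hypothetical checkpoint choice; any pretrained BERT would do.
tokenizer = BertTokenizer.from_pretrained("bert-base-german-cased")
bert = BertModel.from_pretrained("bert-base-german-cased").eval()

# Preceding source sentences (the "context") plus the current sentence.
context = ["Erster Satz des Dokuments .", "Zweiter Satz ."]
current = "Dritter Satz , der uebersetzt werden soll ."

# Concatenate all contextual sequences into one longer sequence,
# separated by BERT's [SEP] token, and encode it in a single pass.
joined = f" {tokenizer.sep_token} ".join(context + [current])
inputs = tokenizer(joined, return_tensors="pt",
                   truncation=True, max_length=512)

with torch.no_grad():
    outputs = bert(**inputs)

# One contextual feature vector per subword token; these features would
# then be aggregated into the NMT model (e.g., via attention over the
# BERT outputs), which is where the paper's compared methods differ.
context_features = outputs.last_hidden_state  # shape: (1, seq_len, 768)
print(context_features.shape)
```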
Keywords: Neural machine translation, BERT, Context-aware translation
Paper link: https://doi.org/10.1007/s10994-021-06070-y