Distant context aware text generation from abstract meaning representation

Authors: Sen Yang, Dawei Feng, Yang Liu, Dongsheng Li

Abstract

Text generation from abstract meaning representation is a fundamental task in natural language generation. An interesting challenge is that distant context can influence the surface realization of each node. In previous encoder-decoder based approaches, graph neural networks have commonly been used to encode abstract meaning representation graphs and have outperformed sequence and tree encoders. However, most of them cannot stack many layers and are thus too shallow to capture distant context. In this paper, we propose solutions from three aspects. Firstly, we introduce a Transformer-based graph encoder to embed abstract meaning representation graphs. This encoder can stack more layers to encode larger context without performance degradation. Secondly, we expand the receptive field of each node, i.e., we build direct connections between node pairs so that each node can capture information from its distant neighbors. We also exploit relative position embeddings to make the model aware of the original hierarchy of the graphs. Thirdly, we encode the linearized version of the abstract meaning representation with a pre-trained language model to obtain a sequence encoding, and incorporate it into the graph encoding to enrich features. We conduct experiments on LDC2015E86 and LDC2017T10. Experimental results demonstrate that our method outperforms previous strong baselines. In particular, when we investigate the performance of our model on large graphs, we find an even larger performance gain. Our best model achieves 31.99 BLEU and 37.02 METEOR on LDC2015E86, and 34.21 BLEU and 39.26 METEOR on LDC2017T10, setting a new state of the art.
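The receptive-field expansion described above can be illustrated with a minimal sketch (not the authors' implementation): compute all-pairs shortest-path distances over the AMR graph so that attention can connect every node pair directly, then clip the distances into buckets that would index a relative position embedding table. The graph, the `MAX_DIST` cutoff, and the function names here are illustrative assumptions.

```python
from collections import deque

def node_distances(adj, n):
    """All-pairs shortest-path distances via BFS.
    Connecting every node pair by its graph distance widens each
    node's receptive field beyond its immediate neighbors, while the
    distance itself preserves the original graph hierarchy."""
    dist = [[-1] * n for _ in range(n)]  # -1 marks "unreachable"
    for src in range(n):
        dist[src][src] = 0
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if dist[src][v] == -1:
                    dist[src][v] = dist[src][u] + 1
                    q.append(v)
    return dist

# Hypothetical 4-node AMR-like graph, stored as an undirected chain 0-1-2-3.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
D = node_distances(adj, 4)

# Clip distances into buckets so a relative position embedding table
# stays small; unreachable pairs fall into the farthest bucket.
MAX_DIST = 2
buckets = [[min(d, MAX_DIST) if d >= 0 else MAX_DIST for d in row]
           for row in D]
```

In a Transformer-style graph encoder, `buckets[i][j]` would index a learned embedding added as a bias to the attention score between nodes `i` and `j`, so distant pairs interact in a single layer rather than through many stacked message-passing steps.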

Keywords: Text generation, Abstract meaning representation, Graph encoder, Receptive field


Paper link: https://doi.org/10.1007/s10489-021-02431-1