
6    Conclusion and Future Work

This paper is the first to introduce sequence-to-sequence pre-training into AMR-to-text generation, proposing three simple yet effective pre-training tasks and two fine-tuning methods. Experimental results show that our approach effectively improves the performance of AMR-to-text generation and achieves state-of-the-art results on both AMR datasets; in particular, when higher-quality automatically annotated corpora are used, the best score on AMR 2.0 reaches 42.22.
With the pre-trained model, our model still performs well on AMR graphs with many nodes; however, when the number of reentrant nodes is large, there remains considerable room for improvement. In future work, we will try to address the loss of structural information about reentrant nodes caused by linearizing AMR.
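As a concrete illustration of this loss, the following minimal sketch (our own example, not the paper's code) linearizes a small AMR graph depth-first in PENMAN style under an assumed dictionary-based graph format; the reentrant variable b appears the second time only as a bare token, so a sequence model must recover its link to the earlier subtree on its own.

```python
# Minimal illustration (assumed graph format and function names, not the
# paper's implementation): a depth-first, PENMAN-style linearization of an
# AMR graph. The second occurrence of a reentrant variable is emitted as an
# ordinary token, so the explicit reentrancy edge is lost in the sequence.

def linearize(var, graph, visited=None):
    """Depth-first linearization of an AMR graph into a token sequence."""
    if visited is None:
        visited = set()
    if var in visited:
        # Reentrant node: only the variable name is emitted, with no record
        # of which earlier subtree this token refers back to.
        return [var]
    visited.add(var)
    tokens = ["(", var, "/", graph[var]["concept"]]
    for role, child in graph[var]["edges"]:
        tokens += [role] + linearize(child, graph, visited)
    tokens.append(")")
    return tokens

# "The boy wants to go": variable b (boy) is reentrant, serving as the
# :ARG0 of both want-01 and go-01.
amr = {
    "w": {"concept": "want-01", "edges": [(":ARG0", "b"), (":ARG1", "g")]},
    "b": {"concept": "boy", "edges": []},
    "g": {"concept": "go-01", "edges": [(":ARG0", "b")]},
}

print(" ".join(linearize("w", amr)))
# ( w / want-01 :ARG0 ( b / boy ) :ARG1 ( g / go-01 :ARG0 b ) )
```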

