[18] Song L, Zhang Y, Wang Z, et al. A graph-to-sequence model for AMR-to-text generation. In: Proc. of the 56th Annual Meeting of
the Association for Computational Linguistics (Vol.1: Long Papers). 2018. 1616-1626.
[19] Beck D, Haffari G, Cohn T. Graph-to-sequence learning using gated graph neural networks. In: Proc. of the 56th Annual Meeting
of the Association for Computational Linguistics (Vol.1: Long Papers). 2018. 273-283.
[20] Damonte M, Cohen SB. Structural neural encoders for AMR-to-text generation. In: Proc. of the 2019 Conf. of the North American
Chapter of the Association for Computational Linguistics: Human Language Technologies (Vol.1: Long and Short Papers). 2019.
3649-3658.
[21] Guo Z, Zhang Y, Teng Z, et al. Densely connected graph convolutional networks for graph-to-sequence learning. Trans. of the
Association for Computational Linguistics, 2019,7:297-312.
[22] Ribeiro LFR, Gardent C, Gurevych I. Enhancing AMR-to-text generation with dual graph representations. In: Proc. of the 2019
Conf. on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. 2019.
3183-3194.
[23] Cai D, Lam W. Graph transformer for graph-to-sequence learning. In: Proc. of the 34th AAAI Conf. on Artificial Intelligence. 2020.
7464-7471.
[24] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need. In: Proc. of the 31st Int’l Conf. on Neural Information Processing
Systems. 2017. 6000-6010.
[25] Zhao Y, Chen L, Chen Z, et al. Line graph enhanced AMR-to-text generation with mix-order graph attention networks. In: Proc. of
the 58th Annual Meeting of the Association for Computational Linguistics. 2020. 732-741.
[26] Song L, Wang A, Su J, et al. Structural information preserving for graph-to-text generation. In: Proc. of the 58th Annual Meeting of
the Association for Computational Linguistics. 2020. 7987-7998.
[27] Mikolov T, Sutskever I, Chen K, et al. Distributed representations of words and phrases and their compositionality. In: Proc. of the
26th Int’l Conf. on Neural Information Processing Systems, Vol.2. 2013. 3111-3119.
[28] Pennington J, Socher R, Manning CD. GloVe: Global vectors for word representation. In: Proc. of the 2014 Conf. on Empirical
Methods in Natural Language Processing. 2014. 1532-1543.
[29] McCann B, Bradbury J, Xiong C, et al. Learned in translation: Contextualized word vectors. In: Proc. of the 31st Int’l Conf. on
Neural Information Processing Systems. 2017. 6297-6308.
[30] Wang L, Zhao W, Jia R, et al. Denoising based sequence-to-sequence pre-training for text generation. In: Proc. of the 2019 Conf.
on Empirical Methods in Natural Language Processing and the 9th Int’l Joint Conf. on Natural Language Processing. 2019.
3994-4006.
[31] Song K, Tan X, Qin T, et al. MASS: Masked sequence to sequence pre-training for language generation. In: Proc. of the Int’l Conf.
on Machine Learning. 2019. 5926-5936.
[32] Lewis M, Liu Y, Goyal N, et al. BART: Denoising sequence-to-sequence pre-training for natural language generation, translation,
and comprehension. In: Proc. of the 58th Annual Meeting of the Association for Computational Linguistics. 2020. 7871-7880.
[33] Mager M, Astudillo RF, Naseem T, et al. GPT-too: A language-model-first approach for AMR-to-Text generation. In: Proc. of the
58th Annual Meeting of the Association for Computational Linguistics. 2020. 1846-1852.
[34] Harkous H, Groves I, Saffari A. Have your text and use it too! End-to-end neural data-to-text generation with semantic fidelity.
arXiv preprint arXiv:2004.06577, 2020.
[35] Sennrich R, Haddow B, Birch A. Neural machine translation of rare words with subword units. In: Proc. of the 54th Annual
Meeting of the Association for Computational Linguistics (Vol.1: Long Papers). 2016. 1715-1725.
[36] Hu J, Xia M, Neubig G, et al. Domain adaptation of neural machine translation by lexicon induction. In: Proc. of the 57th Annual
Meeting of the Association for Computational Linguistics. 2019. 2989-3001.
[37] Vincent P, Larochelle H, Lajoie I, et al. Stacked denoising autoencoders: Learning useful representations in a deep network with a
local denoising criterion. Journal of Machine Learning Research, 2010,11(12).
[38] Ge D, Li J, Zhu M, et al. Modeling source syntax and semantics for neural AMR parsing. In: Proc. of the 28th Int’l Joint Conf. on
Artificial Intelligence. AAAI, 2019. 4975-4981.