4068–4074.
[7] XU Weidi, TAN Ying. Semi-supervised target-oriented sentiment classification[J]. Neurocomputing, 2019, 337: 120–128.
[8] PETERS M, NEUMANN M, IYYER M, et al. Deep contextualized word representations[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics. New Orleans, USA, 2018: 2227–2237.
[9] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. Long Beach, USA, 2017: 6000–6010.
[10] RADFORD A, NARASIMHAN K, SALIMANS T, et al. Improving language understanding by generative pre-training[EB/OL]. [2019-5-10]. https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf.
[11] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics. Minneapolis, USA, 2019: 4171–4186.
[12] KARIMI A, ROSSI L, PRATI A. Adversarial training for aspect-based sentiment analysis with BERT[EB/OL]. [2019-5-10]. https://arxiv.org/abs/2001.11316.
[13] SONG Youwei, WANG Jiahai, JIANG Tao, et al. Attentional encoder network for targeted sentiment classification[EB/OL]. [2019-5-10]. https://arxiv.org/abs/1902.09314.
[14] JIANG Long, YU Mo, ZHOU Ming, et al. Target-dependent twitter sentiment classification[C]//Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies. Portland, USA, 2011: 151–160.
[15] CUI Y, CHEN Z, WEI S, et al. Attention-over-attention neural networks for reading comprehension[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver, Canada, 2017: 593–602.
[16] PENNINGTON J, SOCHER R, MANNING C. GloVe: global vectors for word representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). Doha, Qatar, 2014: 1532–1543.
[17] ZENG Biqing, YANG Heng, XU Ruyang, et al. LCF: a local context focus mechanism for aspect-based sentiment classification[J]. Applied sciences, 2019, 9(16): 1–22.
[18] PONTIKI M, GALANIS D, PAVLOPOULOS J, et al. SemEval-2014 task 4: aspect based sentiment analysis[C]//Proceedings of the 8th International Workshop on Semantic Evaluation (SemEval 2014). Dublin, Ireland, 2014: 27–35.
[19] DONG Li, WEI Furu, TAN Chuanqi, et al. Adaptive recursive neural network for target-dependent twitter sentiment classification[C]//Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore, USA, 2014: 49–54.
[20] KIRITCHENKO S, ZHU Xiaodan, CHERRY C, et al. NRC-Canada-2014: detecting aspects and sentiment in customer reviews[C]//Proceedings of the 8th International Workshop on Semantic Evaluation. Dublin, Ireland, 2014: 437–442.
[21] CHEN Peng, SUN Zhongqian, BING Lidong, et al. Recurrent attention network on memory for aspect sentiment analysis[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen, Denmark, 2017: 452–461.
[22] XU H, LIU B, SHU L, et al. BERT post-training for review reading comprehension and aspect-based sentiment analysis[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics. Minneapolis, USA, 2019: 2324–2335.
[23] GLOROT X, BENGIO Y. Understanding the difficulty of training deep feedforward neural networks[C]//Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics. Chia Laguna Resort, Italy, 2010: 249–256.
[24] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: a simple way to prevent neural networks from overfitting[J]. The journal of machine learning research, 2014, 15(1): 1929–1958.
[25] KINGMA D P, BA J. Adam: a method for stochastic optimization[C]//The 3rd International Conference on Learning Representations. San Diego, USA, 2015. http://arxiv.org/abs/1412.6980.
[26] LI Xin, BING Lidong, LI Piji, et al. A unified model for opinion target extraction and target sentiment prediction[J]. Proceedings of the AAAI conference on artificial intelligence, 2019, 33(1): 6714–6721.