[Figure 2: Sensitivity analysis on insuranceQA. (a) Sensitivity of β when δ = 1e-6: Precision@1 on Test1 and Test2 as β varies from 0 to 20. (b) Sensitivity of δ when β = 5: Precision@1 on Test1 and Test2 as δ varies from 0 to 1e-4.]

[Figure 3: Sensitivity analysis on wikiQA. (a) Sensitivity of β when δ = 1e-6: MAP and MRR as β varies from 0 to 20. (b) Sensitivity of δ when β = 5: MAP and MRR as δ varies from 0 to 1e-4.]

HAS is flexible enough to integrate other encoders and question-answer interaction mechanisms. Furthermore, the idea of adopting hashing for binary representation learning in HAS can also be applied to other NLP tasks. All of these possible extensions will be pursued in our future work.

Acknowledgments

This work is supported by the NSFC-NRF Joint Research Project (No. 61861146001).