[1] Pang B, Lee L. Opinion mining and sentiment analysis[M]. Hanover: Now Publishers Inc, 2008.
[2] Jiao J, Zhou Y. Sentiment polarity analysis based multi-dictionary[J]. Physics Procedia, 2011, 22: 590-596.
[3] Jurek A, Mulvenna M D, Bi Y. Improved lexicon-based sentiment analysis for social media analytics[J]. Security Informatics, 2015, 4(1): 9.
[4] Li F. The information content of forward-looking statements in corporate filings: a naïve Bayesian machine learning approach[J]. Journal of Accounting Research, 2010, 48: 1049-1102.
[5] Hai Z, Cong G, Chang K, et al. Analyzing sentiments in one go: a supervised joint topic modeling approach[J]. IEEE Transactions on Knowledge and Data Engineering, 2017, 29(6): 1172-1185.
[6] Singh J, Singh G, Singh R. Optimization of sentiment analysis using machine learning classifiers[J]. Human-centric Computing and Information Sciences, 2017, 7: 1-32.
[7] Al-Amrani Y, Lazaar M, El-Kadiri K E. Random forest and support vector machine based hybrid approach to sentiment analysis[J]. Procedia Computer Science, 2018, 127: 511-520.
[8] Yang K M, Wu M F, Chen T. Survey of generalized text sentiment analysis[J]. Journal of Computer Applications, 2019, 39(S2): 6-14.
[9] Man X, Luo T, Lin J. Financial sentiment analysis (FSA): a survey[C]// 2019 IEEE International Conference on Industrial Cyber Physical Systems (ICPS). 2019: 617-622.
[10] Devlin J, Chang M W, Lee K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2019: 4171-4186.
[11] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. 2017: 6000-6010.
[12] Peters M E, Neumann M, Iyyer M, et al. Deep contextualized word representations[C]// Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2018: 2227-2237.
[13] Radford A, Narasimhan K, Salimans T, et al. Improving language understanding by generative pre-training[EB/OL]. [2020-12-01]. http://www.nlpir.org/wordpress/wp-content/uploads/2019/06/Improving-language-understanding-by-generative-pre-training.pdf.
[14] Radford A, Wu J, Child R, et al. Language models are unsupervised multitask learners[EB/OL]. [2020-12-01]. https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf.
[15] Yang Z L, Dai Z H, Yang Y M, et al. XLNet: generalized autoregressive pretraining for language understanding[C]// Proceedings of the 33rd International Conference on Neural Information Processing Systems. 2019: 5753-5763.
[16] Dai Z H, Yang Z L, Yang Y M, et al. Transformer-XL: attentive language models beyond a fixed-length context[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. 2019: 2978-2988.
[17] Zhu Q L, Liang B, Xu R F, et al. Fine-grained sentiment analysis combining financial domain sentiment lexicon and attention mechanism[J]. Journal of Chinese Information Processing, 2022, 36(18): 109-117.