Title:
A novel span and syntax enhanced large language model based framework for fine-grained sentiment analysis.
Authors:
Zou H; School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China; Department of Computer Science and Software Engineering, Concordia University, 2155 Guy Street, Montreal, H3H 2L9, Quebec, Canada. Electronic address: haochen.zou@mail.concordia.ca., Wang Y; School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China. Electronic address: yongliwang@njust.edu.cn., Huang A; School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China. Electronic address: anqihuang@njust.edu.cn.
Source:
Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2026 Jan; Vol. 193, pp. 108012. Date of Electronic Publication: 2025 Aug 21.
Publication Type:
Journal Article
Language:
English
Journal Info:
Publisher: Pergamon Press Country of Publication: United States NLM ID: 8805018 Publication Model: Print-Electronic Cited Medium: Internet ISSN: 1879-2782 (Electronic) Linking ISSN: 08936080 NLM ISO Abbreviation: Neural Netw Subsets: MEDLINE
Imprint Name(s):
Original Publication: New York : Pergamon Press, [c1988-
Contributed Indexing:
Keywords: Fine-grained sentiment analysis; Large language model; Natural language processing; Span-aware attention; Syntax-aware transformer
Entry Date(s):
Date Created: 20250828 Date Completed: 20251217 Latest Revision: 20251217
Update Code:
20251217
DOI:
10.1016/j.neunet.2025.108012
PMID:
40876298
Database:
MEDLINE

Further Information

Fine-grained aspect-based sentiment analysis requires language models to identify aspect entities and the corresponding sentiment information in the input text. Transformer-based pre-trained large language models have demonstrated remarkable performance on a variety of challenging natural language processing tasks. However, large language models face limitations in explicitly modelling syntactic relationships and in capturing local nuances between terms in the text, which constrains their capability in fine-grained aspect-based sentiment analysis. We propose a novel span and syntax enhanced joint learning framework built on a state-of-the-art large language model. The framework incorporates three key components, the span-aware attention mechanism, the contextual Transformer, and the syntax-aware Transformer, which operate in parallel to generate span-aware features, contextual features, and syntax-aware features, respectively. The three feature streams are dynamically fused in the feature aggregation module, yielding a combined representation for aspect entity recognition and sentiment classification. To the best of our knowledge, this study represents the first effort to comprehensively leverage span-aware, contextual, and syntax-aware characteristics to augment large language models for the fine-grained aspect-based sentiment analysis task. Experimental results on publicly available benchmark datasets validate the effectiveness of the architecture compared to state-of-the-art baselines.
(Copyright © 2025 Elsevier Ltd. All rights reserved.)
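The abstract does not specify how the feature aggregation module combines the three parallel streams. Below is a minimal sketch of one plausible scheme, a per-token gated (softmax-weighted) fusion over the span-aware, contextual, and syntax-aware features; the function names, the shared gate vector, and the gating mechanism itself are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def aggregate_features(span_feat, ctx_feat, syn_feat, gate_w):
    """Dynamically fuse three (seq_len, d) feature streams.

    A shared gate vector `gate_w` (shape (d,)) scores each stream
    per token; softmax over the three streams gives mixing weights.
    """
    feats = np.stack([span_feat, ctx_feat, syn_feat])  # (3, seq_len, d)
    scores = feats @ gate_w                            # (3, seq_len)
    weights = softmax(scores, axis=0)                  # sums to 1 over streams
    return (weights[..., None] * feats).sum(axis=0)    # (seq_len, d)

# Example: fuse random features for a 4-token sequence, hidden size 8
rng = np.random.default_rng(0)
span_f, ctx_f, syn_f = rng.normal(size=(3, 4, 8))
combined = aggregate_features(span_f, ctx_f, syn_f, rng.normal(size=8))
print(combined.shape)  # (4, 8)
```

The combined representation would then feed the joint heads for aspect entity recognition and sentiment classification; richer gates (e.g. per-dimension or attention-based) are equally plausible readings of "dynamically fused".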

Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.