An advanced model integrating prompt tuning and dual-channel paradigm for enhancing public opinion sentiment classification
DOI number:
10.1016/j.compeleceng.2024.110047
Journal:
COMPUTERS & ELECTRICAL ENGINEERING
Abstract:
Sentiment analysis of online comments is crucial for governments in managing public opinion effectively. However, existing sentiment models face challenges in balancing memory efficiency with predictive accuracy. To address this, we propose PRTB-BERT, a hybrid model that combines prompt tuning with a dual-channel approach. PRTB-BERT employs a streamlined soft prompt template for efficient training with minimal parameter updates, leveraging BERT to generate word embeddings from the input text. To enhance performance, we integrate improved TextCNN and BiLSTM networks, capturing both local features and contextual semantic information. Additionally, we introduce a residual self-attention (RSA) mechanism into TextCNN to strengthen information extraction. Extensive experiments on four comment datasets evaluate PRTB-BERT's classification performance and memory usage, and compare soft and hard prompt templates. Results show that PRTB-BERT improves accuracy while reducing memory consumption, with the optimized soft prompt template outperforming traditional hard prompts in predictive performance.
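The dual-channel design described in the abstract can be sketched as follows. This is a minimal illustrative PyTorch layout, not the authors' implementation: BERT embeddings are represented as a pre-computed tensor, the TextCNN branch wraps a self-attention layer in a residual connection (one plausible reading of "residual self-attention"), and the BiLSTM branch supplies contextual features; all layer sizes, kernel widths, and class names here are assumptions.

```python
import torch
import torch.nn as nn

class RSATextCNN(nn.Module):
    """TextCNN branch with a residual self-attention (RSA) step (hypothetical layout)."""
    def __init__(self, emb_dim=128, n_filters=64, kernel_sizes=(3, 4, 5)):
        super().__init__()
        self.attn = nn.MultiheadAttention(emb_dim, num_heads=4, batch_first=True)
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, n_filters, k) for k in kernel_sizes
        )
        self.out_dim = n_filters * len(kernel_sizes)

    def forward(self, x):                       # x: (batch, seq, emb)
        a, _ = self.attn(x, x, x)
        x = x + a                               # residual connection around self-attention
        x = x.transpose(1, 2)                   # (batch, emb, seq) for Conv1d
        feats = [torch.relu(c(x)).max(dim=2).values for c in self.convs]
        return torch.cat(feats, dim=1)          # (batch, out_dim): pooled local features

class DualChannelClassifier(nn.Module):
    """Concatenates TextCNN (local) and BiLSTM (contextual) features for classification."""
    def __init__(self, emb_dim=128, hidden=64, n_classes=2):
        super().__init__()
        self.cnn = RSATextCNN(emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(self.cnn.out_dim + 2 * hidden, n_classes)

    def forward(self, emb):                     # emb: word embeddings, e.g. from BERT
        local = self.cnn(emb)
        _, (h, _) = self.lstm(emb)
        ctx = torch.cat([h[-2], h[-1]], dim=1)  # final forward + backward hidden states
        return self.fc(torch.cat([local, ctx], dim=1))

model = DualChannelClassifier()
logits = model(torch.randn(4, 32, 128))         # stand-in for BERT output: batch 4, seq 32
print(logits.shape)                             # torch.Size([4, 2])
```

In the paper's setting, the embedding tensor would come from a frozen (or lightly tuned) BERT with a soft prompt prepended to the input, so that training updates mainly the prompt vectors and the two lightweight downstream channels.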