This study addresses the challenges of sentiment analysis in low-resource educational contexts by proposing a framework that integrates Few-Shot Learning (FSL) with Transformer-based ensemble models and boosting techniques. Sentiment analysis of student feedback is crucial for improving teaching quality, yet traditional methods struggle with data scarcity and computational inefficiency. The proposed framework leverages the self-attention mechanism of Transformers and combines model outputs through Gradient Boosting to enhance performance and generalization with minimal labeled data. Evaluated on the UIT-VSFC dataset of Vietnamese student feedback, the framework achieved superior F1-scores on both sentiment and topic classification tasks, outperforming the individual models. These results demonstrate its potential for extracting actionable insights that enhance educational experiences. Despite its effectiveness, the approach has limitations, including its reliance on pre-trained models and its computational complexity. Future work could optimize lightweight models and explore applications in other domains such as healthcare and finance.
Key words: Few-Shot Learning, Boosting, Transformer Models, Sentiment Analysis
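The ensemble strategy summarized above can be illustrated with a minimal sketch: features produced by several (here, simulated) Transformer encoders are concatenated and passed to a Gradient Boosting meta-classifier. This is not the authors' implementation; the encoder outputs are replaced by synthetic, label-correlated vectors so the example stays self-contained, and all names below are illustrative.

```python
# Hypothetical sketch of Transformer-ensemble features + Gradient Boosting.
# Real sentence embeddings (e.g., from encoders fine-tuned on UIT-VSFC)
# are stood in for by synthetic label-correlated vectors.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, dim = 200, 32
labels = rng.integers(0, 3, size=n_samples)  # negative / neutral / positive

def fake_encoder(seed):
    """Stand-in for a frozen Transformer encoder: class-centered noise."""
    enc_rng = np.random.default_rng(seed)
    centers = enc_rng.normal(size=(3, dim))          # one center per class
    return centers[labels] + enc_rng.normal(size=(n_samples, dim))

# "Ensemble": concatenate features from two hypothetical encoders.
X = np.hstack([fake_encoder(1), fake_encoder(2)])
X_tr, X_te, y_tr, y_te = train_test_split(
    X, labels, test_size=0.25, random_state=0
)

# Gradient Boosting acts as the combiner over the ensemble features.
clf = GradientBoostingClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

In a few-shot setting, the same pipeline would be trained on only a handful of labeled examples per class, with the pre-trained encoders kept frozen so that the boosting stage is the only component fit to the scarce data.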