Multimodal Sentiment Analysis with Self-attention
2021
Sentiment analysis is an important technique for analyzing and evaluating users' opinions (e.g., feedback on online services). The wide adoption of online social media allows users to express their opinions in increasingly flexible ways (e.g., from text to video), making multimodal sentiment analysis necessary. Existing multimodal sentiment analysis solutions do not consider the coordination between different multimodal signals or the connections between utterances. As a result, the contribution of nonverbal behavior to sentiment analysis is not well understood. To address this issue, we propose an attention framework based on multi-head self-attention to predict utterance-level sentiment while taking context information into account. By fusing the three modal representations with an attention mechanism, our model can learn the contributing features among them. We evaluate our proposed approach on the CMU-MOSI dataset, and the experimental results show that our model outperforms existing solutions.
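The abstract does not specify the model's internals, but the core idea of fusing three modality representations with multi-head self-attention can be sketched. The following is a minimal, illustrative NumPy implementation under assumptions: each utterance is represented by three vectors (text, audio, video) already projected to a shared feature space, and random matrices stand in for learned projection weights. All names and dimensions here are hypothetical, not the authors' actual architecture.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax over the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multihead_self_attention(x, Wq, Wk, Wv, Wo, num_heads):
    """Multi-head self-attention: x of shape (seq_len, d_model) -> same shape."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    def split(h):
        # (seq_len, d_model) -> (num_heads, seq_len, d_head)
        return h.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(x @ Wq), split(x @ Wk), split(x @ Wv)
    # Scaled dot-product attention per head.
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_head))
    # Merge heads back and apply the output projection.
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

# Hypothetical setup: three modality vectors (text, audio, video) for one
# utterance; random weights stand in for learned parameters.
rng = np.random.default_rng(0)
d_model, num_heads = 8, 2
modalities = rng.standard_normal((3, d_model))  # rows: text, audio, video
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                  for _ in range(4))

fused = multihead_self_attention(modalities, Wq, Wk, Wv, Wo, num_heads)
utterance_repr = fused.mean(axis=0)  # pooled multimodal utterance feature
```

In this sketch each modality vector attends to the other two, so the fused representation can weight whichever modality contributes most; the pooled vector would then feed a sentiment classifier.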