Exploring Zero-shot Cross-lingual Aspect-based Sentiment Analysis using Pre-trained Multilingual Language Models

2021 
Aspect-based sentiment analysis (ABSA) has received much attention in the Natural Language Processing research community. Most of the proposed methods are evaluated exclusively on English and other high-resource languages. Leveraging the resources available for English and transferring them to low-resource languages is an immediate solution. In this paper, we investigate the performance of zero-shot cross-lingual transfer learning based on pre-trained multilingual models (mBERT and XLM-R) for two main sub-tasks of the ABSA problem: Aspect Category Detection and Opinion Target Expression. We experiment on benchmark data sets in six languages: English, Russian, Dutch, Spanish, Turkish, and French. The experimental results demonstrate that the XLM-R model can yield relatively acceptable results in the zero-shot cross-lingual scenario.
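The zero-shot protocol described above — fine-tune once on English, then evaluate directly on each target language's test set with no target-language training data — can be sketched as follows. This is a minimal illustration of the evaluation loop only: the toy keyword predictor, label names, and data are assumptions for demonstration, not the authors' code, which fine-tunes mBERT or XLM-R. Aspect Category Detection is framed here as multi-label classification scored with micro-averaged F1.

```python
# Sketch of the zero-shot cross-lingual evaluation loop for Aspect
# Category Detection (ACD). The toy predictor and data below are
# illustrative stand-ins for a fine-tuned mBERT / XLM-R model.

def micro_f1(gold, pred):
    """Micro-averaged F1 over sets of predicted aspect categories."""
    tp = sum(len(g & p) for g, p in zip(gold, pred))
    fp = sum(len(p - g) for g, p in zip(gold, pred))
    fn = sum(len(g - p) for g, p in zip(gold, pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

def zero_shot_eval(model, test_sets):
    """Evaluate a single English-fine-tuned model on every target language."""
    return {
        lang: micro_f1([gold for _, gold in data],
                       [model(text) for text, _ in data])
        for lang, data in test_sets.items()
    }

# Toy stand-in for the fine-tuned multilingual encoder (hypothetical).
def toy_model(text):
    if "delicious" in text or "delicioso" in text:
        return {"FOOD#QUALITY"}
    return {"SERVICE#GENERAL"}

# Hypothetical target-language test sets: (sentence, gold categories).
test_sets = {
    "es": [("La comida estaba deliciosa, todo delicioso", {"FOOD#QUALITY"})],
    "nl": [("De bediening was erg traag", {"SERVICE#GENERAL"})],
}

print(zero_shot_eval(toy_model, test_sets))
```

In a real experiment, `toy_model` would be replaced by a multilingual encoder with a classification head, fine-tuned only on the English training split and applied unchanged to every other language — the transfer comes entirely from the shared multilingual pre-training.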