Does Syntactic Knowledge in Multilingual Language Models Transfer Across Languages?

2018 
Recent work has shown that neural models can be successfully trained on multiple languages simultaneously. We investigate whether such models learn to share and exploit common syntactic knowledge among the languages on which they are trained. This extended abstract presents our preliminary results.