Recurrent Factorization Machine with Self-Attention for Time-aware Service Recommendation

2020 
The emergence of a flood of Web services makes it difficult for users to find suitable services among many similar ones. How to provide users with personalized services by exploiting sequential historical records has become a key issue in service recommendation. In recent years, the Factorization Machine (FM) and Long Short-Term Memory (LSTM) have been proposed for sequential service recommendation. However, FM ignores dynamic long-term dependencies between users and services that change over time. In addition, LSTM suffers from vanishing and exploding gradients. To address these challenges, we propose a novel model, the Recurrent Factorization Machine (RFM), which combines a Gated Recurrent Unit with Self-Attention (SAGRU) and a Projected Factorization Machine (PFM). The new model comes with three innovations: 1) self-attention is incorporated into our model by assigning different weights to service records, extracting an interpretable user-service-time tensor embedding; 2) a bottleneck structure is designed in the downstream candidate output to compress the encoded information and suppress noise; 3) comprehensive experiments on a real-world dataset show that this method significantly outperforms other state-of-the-art time-aware methods for service recommendation.
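The abstract describes three components: a GRU encoding the sequence of service records, self-attention weighting the resulting hidden states, and a bottleneck projection that compresses the encoded representation. A minimal NumPy sketch of that pipeline is given below; all matrix names, dimensions, and the specific bottleneck form are illustrative assumptions, not the paper's actual parameterization.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    # Standard GRU update: gates z (update) and r (reset), candidate h_tilde.
    z = 1.0 / (1.0 + np.exp(-(x @ Wz + h @ Uz)))
    r = 1.0 / (1.0 + np.exp(-(x @ Wr + h @ Ur)))
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)
    return (1.0 - z) * h + z * h_tilde

def self_attention(H, Wq, Wk, Wv):
    # Scaled dot-product self-attention over the T hidden states,
    # assigning a different weight to each service record.
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    attn = softmax(Q @ K.T / np.sqrt(K.shape[1]), axis=-1)
    return attn @ V, attn

rng = np.random.default_rng(0)
d, dh, T = 8, 6, 5                      # record-embedding dim, hidden dim, sequence length
x_seq = rng.normal(size=(T, d))          # a user's sequence of service-record embeddings

# GRU parameters (illustrative random initialization).
gru_params = [rng.normal(scale=0.1, size=s) for s in [(d, dh), (dh, dh)] * 3]
h = np.zeros(dh)
H = []
for x in x_seq:                          # encode the sequence step by step
    h = gru_cell(x, h, *gru_params)
    H.append(h)
H = np.stack(H)                          # (T, dh) hidden states

Wq, Wk, Wv = (rng.normal(scale=0.1, size=(dh, dh)) for _ in range(3))
context, attn = self_attention(H, Wq, Wk, Wv)

# Bottleneck: project down to a narrow layer, then back up,
# compressing the encoded information and suppressing noise.
Wdown = rng.normal(scale=0.1, size=(dh, 3))
Wup = rng.normal(scale=0.1, size=(3, dh))
out = np.tanh(context @ Wdown) @ Wup     # (T, dh) compressed representation
```

In the paper's full model, the compressed representation would feed a projected factorization-machine scoring layer over candidate services; here the sketch stops at the encoder to keep it self-contained.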