Policies for Efficient Data Replication in P2P Systems

2013 
This paper addresses the problem of maintaining replicated data in large-scale P2P systems. Although this topic has been studied extensively in the literature, maintaining replicated data efficiently in this setting remains a significant challenge. This paper proposes novel policies to address this problem and evaluates their performance against several criteria, including monitoring costs, data transfer costs, and load imbalance costs. We show that one of these new policies significantly outperforms previous work. Interestingly, this policy is based on a somewhat counter-intuitive approach that uses less reliable nodes to store the most accessed data items. The insights behind this policy were obtained from an in-depth analysis of existing solutions, which is also presented in the paper.
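The abstract does not detail how the counter-intuitive policy works; the Python sketch below is only a hypothetical illustration of the stated idea, pairing the most accessed items with the least reliable nodes. All names, the reliability and access-count fields, and the round-robin assignment are assumptions for illustration, not the paper's actual algorithm.

```python
import random


def assign_replicas(items, nodes, replicas_per_item=2):
    """Illustrative placement sketch: hottest items are assigned first,
    and replicas are taken from the reliability-sorted node list starting
    with the least reliable nodes (assumed policy, not the paper's)."""
    # Most-accessed items first.
    hot_first = sorted(items, key=lambda it: it["accesses"], reverse=True)
    # Least reliable nodes first (reliability assumed to be in [0, 1]).
    frail_first = sorted(nodes, key=lambda n: n["reliability"])

    placement = {}
    cursor = 0
    for item in hot_first:
        targets = []
        # Round-robin over the node list, skipping duplicates for this item.
        while len(targets) < replicas_per_item:
            node = frail_first[cursor % len(frail_first)]
            cursor += 1
            if node["id"] not in targets:
                targets.append(node["id"])
        placement[item["id"]] = targets
    return placement


if __name__ == "__main__":
    items = [{"id": f"item{i}", "accesses": random.randint(1, 1000)} for i in range(6)]
    nodes = [{"id": f"node{i}", "reliability": random.random()} for i in range(4)]
    for item_id, node_ids in assign_replicas(items, nodes).items():
        print(item_id, "->", node_ids)
```

In this sketch, concentrating hot items on less reliable nodes is meant only to make the abstract's idea concrete; the trade-offs against monitoring, data transfer, and load imbalance costs are evaluated in the paper itself.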