Privacy-Preserving Utility Verification of the Data Published by Non-Interactive Differentially Private Mechanisms

2016 
In the problem of privacy-preserving collaborative data publishing, a central data publisher is responsible for aggregating sensitive data from multiple parties and then anonymizing it before publishing it for data mining. In such scenarios, data users often have a strong need to measure the utility of the published data, since most anonymization techniques have side effects on data utility. Nevertheless, this task is non-trivial, because measuring utility usually requires access to the aggregated raw data, which is not revealed to the data users due to privacy concerns. Furthermore, the data publisher may even cheat on the raw data, since no one, including the individual providers, knows the full data set. In this paper, we first propose a privacy-preserving utility verification mechanism based upon cryptographic techniques for DiffPart, a differentially private scheme designed for set-valued data. This proposal measures data utility based upon the encrypted frequencies of the aggregated raw data rather than the plaintext values, thereby preventing privacy breaches. Moreover, it enables the data users to privately check the correctness of the encrypted frequencies provided by the publisher, which helps detect dishonest publishers. We also extend this mechanism to DiffGen, another differentially private publishing scheme designed for relational data. Our theoretical and experimental evaluations demonstrate the security and efficiency of the proposed mechanism.
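The abstract does not specify the cryptographic construction used to compute on encrypted frequencies. The sketch below is a minimal illustration, assuming an additively homomorphic scheme (textbook Paillier with toy parameters): each party encrypts an indicator bit for whether its record contains a given itemset, and multiplying the ciphertexts yields an encryption of that itemset's frequency, from which count-based utility metrics could be evaluated. The key sizes and the keygen/encrypt/decrypt helpers are illustrative assumptions, not the paper's actual protocol.

```python
# Hypothetical sketch: additively homomorphic aggregation of frequencies
# (textbook Paillier with toy primes; real deployments need >= 2048-bit keys).
import random
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def keygen(p=2357, q=2551):
    n = p * q
    lam = lcm(p - 1, q - 1)
    g = n + 1                      # standard simplification g = n + 1
    mu = pow(lam, -1, n)           # valid because L(g^lam mod n^2) = lam
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    x = pow(c, lam, n * n)
    return ((x - 1) // n) * mu % n

# Each provider encrypts a 0/1 flag ("does my record contain this itemset?").
# Multiplying the ciphertexts gives an encryption of the sum, i.e. the
# itemset's frequency, without revealing any individual contribution.
pk, sk = keygen()
bits = [1, 0, 1, 1, 0, 1]          # per-record indicator flags
ct = 1
for b in bits:
    ct = (ct * encrypt(pk, b)) % (pk[0] ** 2)
assert decrypt(pk, sk, ct) == sum(bits)   # encrypted frequency decrypts to 4
```

The design point the sketch illustrates is that utility metrics depending only on such counts can be verified against ciphertexts, which matches the abstract's claim that utility is measured over encrypted frequencies rather than plaintext records.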