H-RR: a data cleaning approach preserving local differential privacy

Bibliographic Details
Main Authors: Qilong Han, Qianqian Chen, Liguo Zhang, Kejia Zhang
Format: Article
Language: English
Published: Hindawi - SAGE Publishing 2018-12-01
Series:International Journal of Distributed Sensor Networks
Online Access: https://doi.org/10.1177/1550147718819938
Description
Summary: Sensitive data generated by sensors can be protected by adding noise. However, because sensor data are collected in complex environments, the raw data are often inconsistent and must be cleaned before use. In this work, we establish H-RR, a differential privacy cleaning model that detects contradictions arising from functional dependencies, corrects the contradictory data, and uses the indistinguishability among the possible corrections to protect data privacy. The model adds a local differential privacy mechanism to the data cleaning process; while simplifying data pre-processing, it seeks a balance between data availability and security.
ISSN:1550-1477
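
To illustrate the kind of local differential privacy mechanism the summary refers to (this is a generic randomized-response sketch, not the authors' H-RR model; the function names and the choice of a binary attribute are assumptions for illustration):

```python
import math
import random

def randomized_response(value: int, epsilon: float, rng: random.Random) -> int:
    """Report the true bit with probability e^eps / (e^eps + 1), otherwise flip it.

    This satisfies epsilon-local differential privacy for a binary attribute:
    no single report reveals the true value with certainty.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return value if rng.random() < p_truth else 1 - value

def estimate_frequency(reports: list, epsilon: float) -> float:
    """Unbiased estimate of the true fraction of 1s from the noisy reports."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    # Invert the expected mixing: observed = p*true + (1-p)*(1-true_fraction)
    return (observed - (1.0 - p_truth)) / (2.0 * p_truth - 1.0)

rng = random.Random(42)
true_bits = [1] * 700 + [0] * 300          # true fraction of 1s is 0.7
reports = [randomized_response(b, epsilon=1.0, rng=rng) for b in true_bits]
est = estimate_frequency(reports, epsilon=1.0)
```

Each sensor perturbs its own value before it leaves the device, so the aggregator sees only noisy reports, yet population-level statistics remain recoverable. This is the utility/privacy trade-off the abstract describes: a smaller epsilon gives stronger privacy but a noisier estimate.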