GDP vs. LDP: A Survey from the Perspective of Information-Theoretic Channel
- PMID: 35327940
- PMCID: PMC8953244
- DOI: 10.3390/e24030430
Abstract
Existing work has conducted in-depth research on global differential privacy (GDP) and local differential privacy (LDP) based on information theory. However, the data-privacy-preserving community has not systematically reviewed and analyzed GDP and LDP through the information-theoretic channel model. To this end, this survey systematically reviews GDP and LDP from the perspective of the information-theoretic channel. First, we present the privacy threat model under the information-theoretic channel. Second, we describe and compare the information-theoretic channel models of GDP and LDP. Third, we summarize and analyze the definitions, privacy-utility metrics, properties, and mechanisms of GDP and LDP under their respective channel models. Finally, we discuss open problems of GDP and LDP based on different types of information-theoretic channel models. Our main contribution is a systematic survey of the channel models, definitions, privacy-utility metrics, properties, and mechanisms of GDP and LDP from the information-theoretic channel perspective, together with a review of differentially private synthetic data generation based on generative adversarial networks and federated learning. This work supports a systematic understanding of the privacy threat model, definitions, privacy-utility metrics, properties, and mechanisms of GDP and LDP from the channel perspective and promotes further research on GDP and LDP based on different types of information-theoretic channel models.
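To make the channel framing concrete, the following minimal LaTeX sketch restates the standard GDP and LDP definitions as constraints on a privacy channel; the notation (channel W, neighboring datasets x ~ x', budget epsilon) is an illustrative choice and is not quoted from the survey.

```latex
% GDP and LDP viewed as constraints on a privacy channel W
% (minimal sketch; notation is illustrative, not quoted from the survey).
\documentclass{article}
\usepackage{amsmath}
\begin{document}

A randomized mechanism can be modeled as a channel $W(y \mid x)$ from the
sensitive input $x$ to the released output $y$.

\textbf{GDP ($\varepsilon$-differential privacy).} For all neighboring
datasets $x \sim x'$ (differing in one record) and all output sets $S$,
\[
  W(S \mid x) \;\le\; e^{\varepsilon}\, W(S \mid x').
\]

\textbf{LDP ($\varepsilon$-local differential privacy).} For every pair of
individual values $v, v'$ and every output $y$,
\[
  W(y \mid v) \;\le\; e^{\varepsilon}\, W(y \mid v').
\]

Both conditions bound the max-divergence (R\'enyi divergence of order
$\infty$) between rows of the channel:
$D_{\infty}\!\bigl(W(\cdot \mid x) \,\|\, W(\cdot \mid x')\bigr) \le \varepsilon$.

\end{document}
```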
Keywords: GDP vs. LDP; Rényi divergence; expected distortion; information-theoretic channel; mutual information.
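The keyword metrics (Rényi divergence, mutual information, expected distortion) can be illustrated with a small numerical sketch. The snippet below builds the randomized-response channel, a canonical epsilon-LDP mechanism, and reports its worst-case log-ratio (max-divergence), the mutual information it leaks about a uniform input, and the expected Hamming distortion; the function names and the uniform-prior assumption are illustrative choices, not code from the survey.

```python
# Minimal numerical sketch (illustrative, not from the survey): randomized
# response as an LDP channel, with max-divergence and mutual-information checks.
import numpy as np

def randomized_response_channel(k: int, eps: float) -> np.ndarray:
    """k x k channel matrix W[y, x] = P(output=y | input=x) for k-ary
    randomized response with privacy budget eps."""
    p_keep = np.exp(eps) / (np.exp(eps) + k - 1)   # report the true value
    p_flip = 1.0 / (np.exp(eps) + k - 1)           # report any other value
    W = np.full((k, k), p_flip)
    np.fill_diagonal(W, p_keep)
    return W

def max_divergence(W: np.ndarray) -> float:
    """Worst-case log-ratio between any two input distributions (columns of W),
    i.e., the Renyi divergence of order infinity; this is the channel's
    effective epsilon."""
    k = W.shape[1]
    return max(
        float(np.max(np.log(W[:, x] / W[:, x2])))
        for x in range(k) for x2 in range(k) if x != x2
    )

def mutual_information(W: np.ndarray, prior: np.ndarray) -> float:
    """I(X; Y) in nats when the input distribution `prior` is pushed through W."""
    joint = W * prior                       # joint[y, x] = P(y | x) P(x)
    p_y = joint.sum(axis=1, keepdims=True)  # output marginal P(y)
    ratio = joint / (p_y * prior)           # P(x, y) / (P(x) P(y))
    return float(np.sum(joint * np.log(ratio)))

if __name__ == "__main__":
    k, eps = 4, 1.0
    W = randomized_response_channel(k, eps)
    prior = np.full(k, 1.0 / k)             # uniform prior (assumption)
    print("columns sum to 1   :", np.allclose(W.sum(axis=0), 1.0))
    print("effective epsilon  :", max_divergence(W))          # equals eps
    print("leakage I(X;Y) nats:", mutual_information(W, prior))
    # Expected Hamming distortion E[1{X != Y}] as a simple utility proxy.
    print("expected distortion:", float(1.0 - W.diagonal() @ prior))
```

By construction, the reported effective epsilon equals the budget exactly (the diagonal-to-off-diagonal ratio is e^eps), while the mutual information and expected Hamming distortion quantify the leakage and utility sides of the tradeoff for the chosen prior.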
Conflict of interest statement
The authors declare no conflict of interest.
Similar articles
- A Comprehensive Survey on Local Differential Privacy toward Data Statistics and Analysis. Sensors (Basel). 2020 Dec 8;20(24):7030. doi: 10.3390/s20247030. PMID: 33302517. Free PMC article. Review.
- LDP-GAN: Generative adversarial networks with local differential privacy for patient medical records synthesis. Comput Biol Med. 2024 Jan;168:107738. doi: 10.1016/j.compbiomed.2023.107738. Epub 2023 Nov 19. PMID: 37995536.
- Mechanisms for Robust Local Differential Privacy. Entropy (Basel). 2024 Mar 6;26(3):233. doi: 10.3390/e26030233. PMID: 38539745. Free PMC article.
- PLDP-FL: Federated Learning with Personalized Local Differential Privacy. Entropy (Basel). 2023 Mar 10;25(3):485. doi: 10.3390/e25030485. PMID: 36981374. Free PMC article.
- A systematic review of privacy-preserving methods deployed with blockchain and federated learning for the telemedicine. Healthc Anal (N Y). 2023 Nov;3:100192. doi: 10.1016/j.health.2023.100192. Epub 2023 May 5. PMID: 37223223. Free PMC article. Review.
Cited by
- Privacy preservation for federated learning in health care. Patterns (N Y). 2024 Jul 12;5(7):100974. doi: 10.1016/j.patter.2024.100974. eCollection 2024 Jul 12. PMID: 39081567. Free PMC article. Review.
Grants and funding
- Grant 62002081, Grant 62062020, Grant U1836205, and Grant 61602290/National Natural Science Foundation of China
- Grant 2019M663907XB/Project Funded by China Postdoctoral Science Foundation
- Grant 20183001/Major Scientific and Technological Special Project of Guizhou Province
- Grant 2018BDKFJJ004/Foundation of Guizhou Provincial Key Laboratory of Public Big Data