Variational optimization of informational privacy
M. Kumar, D. Brunner, B. Moser, B. Freudenthaler. Variational optimization of informational privacy. In: DEXA 2020: Database and Expert Systems Applications. Communications in Computer and Information Science, pp. 32-47, 2020. DOI: 10.1007/978-3-030-59028-4_4.
Datasets containing sensitive information cannot be shared publicly because several types of attacks pose a privacy risk. The data-perturbation approach preserves privacy by adding random noise, but this distorts the useful data. Studying and optimizing the privacy-utility tradeoff remains a challenge, especially when the statistical distributions of the data are unknown. This study introduces a novel information-theoretic framework for studying the privacy-utility tradeoff that is suitable for multivariate data and for cases with unknown statistical distributions. We take an information-theoretic approach, quantifying privacy leakage as the mutual information between the sensitive data and the released data. At the core of the privacy-preserving framework lies a variational Bayesian fuzzy model that approximates the uncertain mapping between the released noise-added data and the private data; this model is employed for a variational approximation of informational privacy. The suggested privacy-preserving framework consists of three components: 1) an optimal noise-adding mechanism; 2) modeling of the uncertain mapping between released noise-added data and private data; and 3) variational approximation of informational privacy.
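The tradeoff the abstract describes can be illustrated with a minimal sketch: add Laplace noise to a sensitive attribute and estimate the mutual information between the sensitive and released data from a joint histogram. This is an illustrative assumption, not the paper's method; the variable names, the Laplace mechanism, and the histogram-based mutual-information estimator are stand-ins for the optimal noise-adding mechanism and variational approximation developed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensitive attribute; the paper targets multivariate data.
x = rng.normal(0.0, 1.0, size=100_000)

def perturb(x, scale):
    """Release noise-added data via an additive Laplace mechanism (assumed)."""
    return x + rng.laplace(0.0, scale, size=x.shape)

def mutual_information(a, b, bins=64):
    """Histogram-based estimate of I(a; b) in nats (a crude stand-in for
    the paper's variational approximation of informational privacy)."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal of a
    py = p.sum(axis=0, keepdims=True)   # marginal of b
    nz = p > 0                          # avoid log(0)
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))

# Larger noise scale -> lower mutual information (less privacy leakage),
# but also more distortion of the useful data: the privacy-utility tradeoff.
for scale in (0.1, 0.5, 2.0):
    y = perturb(x, scale)
    print(f"scale={scale}: I(x; y) ~ {mutual_information(x, y):.3f} nats")
```

Sweeping the noise scale traces out the tradeoff curve: leakage falls monotonically as the noise grows, which is the quantity the framework optimizes over.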