Perspex: Flexible and Transparent Local Differential Privacy

Scientific Abstract

This proposal studies differential privacy, a rigorous, quantitative definition of privacy that bounds the privacy loss incurred by releasing the results of statistical queries. Using both static and dynamic program analysis, a number of programming frameworks have been designed for building analyses that are differentially private by construction. These frameworks all assume that the sensitive data is collected centrally and processed by a trusted curator. However, the main deployments of differential privacy in practice, such as Google Chrome's collection of browsing statistics and Apple's training of predictive messaging features, use a purely local mechanism, avoiding the central collection of sensitive data altogether. While this is a benefit of the local approach, users of systems like Apple's are required to trust entirely that the analysis running on their device has the claimed privacy properties.
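For concreteness, the canonical local mechanism is randomized response, which underlies systems such as Google's RAPPOR: each user perturbs their own value before it ever leaves their device. The following is a minimal Python sketch of the binary case (an illustration, not part of the proposed framework); choosing the truthful reporting probability e^eps / (e^eps + 1) bounds the ratio of output probabilities for any two inputs by e^eps, which is exactly the eps-local differential privacy guarantee.

    import math
    import random

    def randomized_response(truth: bool, epsilon: float) -> bool:
        # Report the true bit with probability e^eps / (e^eps + 1),
        # otherwise report its negation. For any two inputs, the
        # probabilities of each output differ by a factor of at most
        # e^eps, so the mechanism is eps-locally differentially private.
        p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
        return truth if random.random() < p_truth else not truth

    # Example: with eps = ln 3, the true answer is reported 75% of the time.
    print(randomized_response(True, math.log(3)))

Because the perturbation happens on the user's device, the aggregator never sees the raw bit, only the noisy report.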

We propose a framework for transparent local differential privacy, in which analyses can be statically verified to have their claimed differential privacy properties. Our proposal builds on a novel method, not previously applied to static analysis, for abstracting the behaviour of probabilistic programs. Beyond this, we will explore new, flexible variants of local differential privacy that permit information declassification in exchange for improved accuracy.
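To illustrate the kind of property such a verifier would establish, the toy Python check below exhaustively compares a mechanism's exact output distributions over a finite domain and confirms that no pair of inputs violates the e^eps bound. This is a dynamic check over given distributions, shown only for intuition; the proposed framework would instead derive such guarantees statically from the program text.

    import math

    def satisfies_ldp(output_dist, epsilon):
        # output_dist maps each input x to a dict {y: Pr[M(x) = y]}.
        # Returns True iff Pr[M(x) = y] <= e^eps * Pr[M(x') = y]
        # for all inputs x, x' and all outputs y.
        bound = math.exp(epsilon)
        inputs = list(output_dist)
        outputs = {y for d in output_dist.values() for y in d}
        return all(
            output_dist[x].get(y, 0.0) <= bound * output_dist[x2].get(y, 0.0)
            for x in inputs for x2 in inputs for y in outputs
        )

    # Exact output distribution of binary randomized response with eps = ln 3:
    eps = math.log(3)
    p = math.exp(eps) / (math.exp(eps) + 1)   # truthful reporting probability, 0.75
    dist = {True:  {True: p,     False: 1 - p},
            False: {True: 1 - p, False: p}}
    print(satisfies_ldp(dist, eps))  # True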

See the Swedish version for the popular science description.

Start date 01/01/2019
End date 31/12/2022
