A new relaxation of differential privacy - A part of Women in Data Science (WiDS)

Presented by: 
Christine O'Keefe
Wednesday 7th December 2016 - 14:15 to 15:00
INI Seminar Room 1
Co-author: Anne-Sophie Charest 

Agencies and organisations around the world are increasingly seeking to realise the value embodied in their growing data holdings, including by making data available for research and policy analysis. On the other hand, access to data must be provided in a way that protects the privacy of individuals represented in the data. In order to achieve a justifiable trade-off between these competing objectives, appropriate measures of privacy protection and data usefulness are needed.  

In recent years, the formal differential privacy condition has emerged as a verifiable privacy protection standard. While differential privacy has had a marked impact on theory and the literature, it has had far less impact in practice. A common concern is that the differential privacy standard is so strong that statistical outputs must be altered to the point where they are no longer useful. Various relaxations have been proposed to increase the utility of outputs, although none has yet achieved widespread adoption. In this paper we describe a new relaxation of the differential privacy condition and demonstrate some of its properties.
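For context, the standard (unrelaxed) epsilon-differential privacy condition is commonly satisfied by adding Laplace noise scaled to a query's sensitivity. The sketch below illustrates that baseline mechanism only; it is not the relaxation proposed in this talk, and the function names and parameters are illustrative assumptions.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Return true_value perturbed with Laplace noise calibrated for
    epsilon-differential privacy (the classical, unrelaxed standard).

    sensitivity: maximum change in the query output when one
    individual's record is added or removed.
    """
    # Noise scale required by the Laplace mechanism: sensitivity / epsilon.
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) via inverse-CDF transform of a uniform draw.
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise
```

Smaller epsilon values give stronger privacy but noisier (less useful) outputs, which is the trade-off motivating the relaxations discussed above.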