Rothschild Lecture: The Promise of Differential Privacy

Presented by: Cynthia Dwork (Harvard University)
Date: Wednesday 30th November 2016, 16:00 to 17:00
Venue: INI Seminar Room 1
Abstract: 
The rise of "Big Data" has been accompanied by an increase in the twin risks of spurious scientific discovery and privacy compromise.  A great deal of effort has been devoted to the former, from the use of sophisticated validation techniques, to deep statistical methods for controlling the false discovery rate in multiple hypothesis testing.  However, there is a fundamental disconnect between the theoretical results and the practice of data analysis: the theory of statistical inference assumes a fixed collection of hypotheses to be tested, selected non-adaptively before the data are gathered, whereas in practice data are shared and reused with hypotheses and new analyses being generated on the basis of data exploration and the outcomes of previous analyses. Privacy-preserving data analysis also has a large literature, spanning several disciplines. However, many attempts have proved problematic either in practice or on paper.   
"Differential privacy" – a recent notion tailored to situations in which data are plentiful – has provided a theoretically sound and powerful framework, giving rise to an explosion of research. We will review the definition of differential privacy, describe some basic algorithmic techniques for achieving it, and see that it also prevents false discoveries arising from adaptivity in data analysis.  