Differential privacy is a promising approach to privacy-preserving data analysis. It provides strong worst-case guarantees about the harm a user could suffer from participating in a differentially private data analysis, yet is flexible enough to support a wide variety of analyses with a high degree of utility. Having already been the subject of a decade of intense scientific study, it has now been deployed in products at government agencies such as the U.S. Census Bureau and at companies like Apple and Google.
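To make the guarantee concrete, here is a minimal sketch of the classic Laplace mechanism, one standard way to achieve differential privacy for numeric queries. The function name and parameters are illustrative, not taken from any particular deployment: adding Laplace noise with scale sensitivity/epsilon to a query answer yields epsilon-differential privacy for that query.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value with Laplace noise of scale sensitivity/epsilon.

    This gives epsilon-differential privacy for a query whose answer
    changes by at most `sensitivity` when one person's data changes.
    """
    rng = rng or random.Random()
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) via inverse-CDF transform of a uniform draw.
    u = rng.random() - 0.5
    noise = -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)
    return true_value + noise

# Example: privately release a count (sensitivity 1) with epsilon = 0.5.
noisy_count = laplace_mechanism(42, sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means stronger privacy but larger noise; the noisy answer is unbiased, so repeated or aggregate queries can still be quite accurate.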
Researchers in differential privacy span many distinct research communities, including algorithms, computer security, cryptography, databases, data mining, machine learning, statistics, programming languages, social sciences, and law. This workshop will bring researchers from these communities together to discuss recent developments in both the theory and practice of differential privacy.
The overall goal of TPDP is to stimulate discussion of the relevance of differentially private data analyses in practice. For this reason, we seek contributions from different research areas of computer science and statistics.
Authors are invited to submit a short abstract (2-4 pages maximum) of their work. Abstracts must be written in English. Check back here soon for the submission deadline and submission server.
Submissions will undergo a lightweight review process and will be judged on originality, relevance, interest, and clarity. Submissions should describe novel work, or work that has already appeared elsewhere but that can stimulate discussion between the different communities at the workshop. Accepted abstracts will be presented at the workshop either in technical sessions or as posters.
The workshop will not have formal proceedings and is not intended to preclude later publication at another venue.
Specific topics of interest for the workshop include (but are not limited to):