Events

Jan. 23, 2018, 12:00-02:00

Room 202 in building 37

In the age of big data, machine learning algorithms have exciting and wide-ranging potential. However, as the data frequently contain sensitive personal information, there are real privacy concerns associated with the development and deployment of this technology. In this talk I will survey the landscape of differentially private learning, a line of work aimed at developing learning algorithms with mathematically proven privacy guarantees. As we will see, many learning tasks of interest are compatible with differential privacy.

We will also examine the reverse connection between privacy and learning, and show that, informally, differential privacy guarantees learning. In other words, learning is essentially the only thing that differentially private algorithms can do. This important connection allows the design of exciting new learning algorithms, even in settings which are not focused on privacy.
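To give a flavor of what a "mathematically proven privacy guarantee" looks like in practice, here is a minimal sketch of the Laplace mechanism, the textbook building block of many differentially private algorithms. It releases an epsilon-differentially-private estimate of a dataset's mean by adding noise calibrated to the query's sensitivity. The function names and the clipping range are illustrative choices, not part of the talk.

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) via inverse-CDF sampling.
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_mean(data, lo, hi, epsilon):
    """Epsilon-differentially-private mean of values clipped to [lo, hi]."""
    n = len(data)
    clipped = [min(max(x, lo), hi) for x in data]
    true_mean = sum(clipped) / n
    # Changing any single record moves the clipped mean by at most (hi - lo) / n,
    # so Laplace noise with scale sensitivity / epsilon yields epsilon-DP.
    sensitivity = (hi - lo) / n
    return true_mean + laplace_noise(sensitivity / epsilon)
```

Note how the noise scale shrinks as the dataset grows: for large n the private answer is close to the true mean, which is one concrete sense in which learning and privacy are compatible.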