# Towards a Characterization of Differentially Private Learning

Shay Moran (Princeton University)

A profound discovery in statistical learning, sometimes referred to as “The Fundamental Theorem of PAC Learning”, asserts that the following properties are equivalent for a hypothesis class H:

(i) H is PAC learnable,

(ii) H satisfies the Uniform Convergence property,

(iii) H has a finite VC dimension.

This result provides formal justification for principles such as Empirical Risk Minimization (which amounts to the equivalence between items (i) and (ii)). It also illuminates basic concepts such as overfitting.
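To make item (iii) concrete, here is a minimal brute-force sketch (an illustration, not part of the talk) that computes the VC dimension of a small finite class, where each hypothesis is represented by the set of domain points it labels 1. A set of points is shattered when the class realizes all of its 0/1 labellings.

```python
from itertools import combinations

def vc_dimension(hypotheses, domain):
    """Brute-force VC dimension of a finite hypothesis class.

    hypotheses: list of sets, each the set of points a hypothesis labels 1.
    domain: list of all domain points.
    """
    def shattered(points):
        # Collect the labellings of `points` realized by the class;
        # shattering means all 2^|points| labellings appear.
        patterns = {tuple(x in h for x in points) for h in hypotheses}
        return len(patterns) == 2 ** len(points)

    d = 0
    for k in range(1, len(domain) + 1):
        if any(shattered(s) for s in combinations(domain, k)):
            d = k
    return d

# Example: thresholds on {0,1,2,3}, h_t = {x : x >= t}.
domain = [0, 1, 2, 3]
thresholds = [set(x for x in domain if x >= t) for t in range(5)]
print(vc_dimension(thresholds, domain))  # 1
```

Thresholds are a standard example of a class with VC dimension 1: any single point can be labelled both ways, but no pair can be, since a threshold labelling a smaller point 1 must also label every larger point 1.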

We will present and discuss an analogous conjecture for private learning:

Conjecture. The following properties are equivalent for a hypothesis class H:

(i) H is privately PAC learnable,

(ii) H satisfies the Private Uniform Convergence property, and

(iii) H has a finite Littlestone dimension.

Private uniform convergence is closely related to the notions of Synthetic Data and Sanitization from the differential privacy literature. It means that, given a labelled sample S as input, one can privately publish an output sample S’ such that every hypothesis in H has (roughly) the same loss with respect to both S and S’. The Littlestone dimension is a combinatorial parameter which can be seen as an adaptive version of the VC dimension.
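The adaptive flavor of the Littlestone dimension can also be sketched in code (again an illustration, not part of the talk). It satisfies the recursion Ldim(H) = max over points x of 1 + min(Ldim(H restricted to x labelled 0), Ldim(H restricted to x labelled 1)), corresponding to the deepest mistake tree an adversary can shatter. On thresholds, the adversary can binary-search the threshold, so the Littlestone dimension grows logarithmically with the number of thresholds even though the VC dimension stays 1; over an infinite domain this gap becomes infinite, which is the key example separating private from non-private learnability.

```python
def littlestone_dim(hypotheses, domain):
    """Brute-force Littlestone dimension of a finite class.

    hypotheses: collection of frozensets, each the set of points labelled 1.
    Returns the depth of the deepest shattered mistake tree.
    """
    classes = set(hypotheses)  # deduplicate identical hypotheses
    if len(classes) <= 1:
        return 0  # a single hypothesis forces no mistakes
    best = 0
    for x in domain:
        h0 = frozenset(h for h in classes if x not in h)
        h1 = frozenset(h for h in classes if x in h)
        if h0 and h1:  # x splits the class: both labels are realizable
            best = max(best, 1 + min(littlestone_dim(h0, domain),
                                     littlestone_dim(h1, domain)))
    return best

# Thresholds on {0,...,3}: VC dimension 1, Littlestone dimension 2,
# since an adversary can binary-search among the 5 thresholds.
domain = [0, 1, 2, 3]
thresholds = [frozenset(x for x in domain if x >= t) for t in range(5)]
print(littlestone_dim(thresholds, domain))  # 2
```

The exponential blow-up of this recursion is why it is only a sketch for tiny finite classes; the conjecture concerns the dimension as an abstract combinatorial quantity.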

Based on joint works with Noga Alon, Olivier Bousquet, Roi Livni, and Maryanthe Malliaris.