Interactive Inference under Information Constraints

Clement Canonne

We consider distributed inference using sequentially interactive protocols, and obtain lower bounds on the minimax sample complexity of interactive protocols under local information constraints, a broad family of resource constraints that captures communication constraints, local differential privacy, and noisy binary queries as special cases. Focusing on the inference tasks of learning (density estimation) and identity testing (goodness-of-fit) for discrete distributions, we establish general lower bounds that take into account the local constraints, modeled as a channel family.

Our main technical contribution is an approach that handles the correlation which builds up due to interactivity, and quantifies how effectively one can exploit this correlation in spite of the local constraints. As a corollary, we show that interactivity does not help for learning or testing under either communication or local privacy constraints. Finally, we provide the first instance of a natural family of local constraints under which interactive protocols strictly outperform noninteractive ones for testing.

Joint work with Jayadev Acharya, Yuhan Liu, Ziteng Sun, and Himanshu Tyagi.