[Abstract] In the conventional statistical framework, a major goal is to develop optimal statistical procedures based on the sample size and the statistical model. In many contemporary applications, however, non-statistical considerations, such as privacy, communication, and computational constraints on the statistical procedures, become crucial. This raises a fundamental question in data science: how can we make optimal statistical inference under such non-statistical constraints? In this talk, we explore recent advances in differentially private learning and in distributed learning under communication constraints in a few specific settings. Our results reveal novel and interesting phenomena and suggest directions for further investigation.