Speaker: Prof. Shurong Zheng (Northeast Normal University)
Time: 15:00–16:00, September 19, 2025
Venue: Room N933, South Building, Academy of Mathematics
[Abstract] Recently, the double descent phenomenon has been widely observed: the generalization error first decreases, then increases, and surprisingly decreases again as model size grows. While a growing body of theoretical work has sought to explain this behavior, its underlying mechanism remains unclear. Motivated by the double ascent behavior observed in high-dimensional hypothesis tests and linear regression, we establish a connection between the occurrence of double descent and the generalized inverse of the sample covariance matrix. Leveraging random matrix theory, we provide a mechanistic understanding of the double descent/ascent phenomenon through the convergence of the eigenvalues of the sample covariance matrix. Building on this insight, we propose a regularization method that eliminates the double ascent phenomenon, so that the generalization error decreases monotonically as model size grows, and we demonstrate its effectiveness through simulation studies.
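The connection between double descent and the generalized inverse of the sample covariance matrix can be illustrated with a small simulation. The sketch below is not the speaker's method, only a standard textbook-style setup: it fits minimum-norm least squares via the Moore-Penrose pseudoinverse as the number of features p grows past the sample size n, and compares it with a ridge-regularized fit. All parameters (n_train, sigma, lam, the feature grid) are illustrative choices, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

n_train, n_test, p_max = 100, 1000, 300
sigma = 0.5  # noise level (illustrative)

# Ground-truth signal over the full feature space; models with p < p_max
# features are misspecified, so bias shrinks as p grows.
beta_full = rng.normal(size=p_max) / np.sqrt(p_max)

X_train_full = rng.normal(size=(n_train, p_max))
X_test_full = rng.normal(size=(n_test, p_max))
y_train = X_train_full @ beta_full + sigma * rng.normal(size=n_train)
y_test = X_test_full @ beta_full + sigma * rng.normal(size=n_test)

for p in [20, 50, 80, 95, 100, 105, 120, 200, 300]:
    X, Xt = X_train_full[:, :p], X_test_full[:, :p]

    # Minimum-norm least squares: beta_hat = X^+ y, i.e. (X'X)^+ X'y,
    # which uses the generalized inverse of the sample covariance X'X.
    beta_mn = np.linalg.pinv(X) @ y_train
    err_mn = np.mean((Xt @ beta_mn - y_test) ** 2)

    # Ridge-regularized fit: the penalty keeps small eigenvalues of X'X
    # away from zero, which suppresses the spike near p = n_train.
    lam = 0.1
    beta_ridge = np.linalg.solve(
        X.T @ X + lam * n_train * np.eye(p), X.T @ y_train
    )
    err_ridge = np.mean((Xt @ beta_ridge - y_test) ** 2)

    print(f"p={p:4d}  min-norm MSE={err_mn:8.3f}  ridge MSE={err_ridge:8.3f}")
```

Running this, the minimum-norm test error should spike near the interpolation threshold p ≈ n_train = 100 (where the smallest eigenvalue of the sample covariance approaches zero) and then descend again as p grows, while the ridge error stays bounded throughout, mirroring the role the abstract assigns to regularization.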
[Speaker Bio] Shurong Zheng is a professor at Northeast Normal University. Her main research interests are large-dimensional random matrix theory and high-dimensional statistical analysis. She has published numerous papers related to large-dimensional random matrix theory in top statistics journals, including the Annals of Statistics, the Journal of the American Statistical Association, and Biometrika. She currently serves on the editorial boards of six academic journals, including Statistica Sinica, the Journal of Business & Economic Statistics, and the Journal of Multivariate Analysis.