Applied and Computational Mathematics Seminar
Title: On the Approximation and Spectral Properties of Shallow ReLU Networks
Speaker: Dr. Tong Mao, Postdoctoral Researcher, King Abdullah University of Science and Technology (KAUST)
Time: Monday, December 22, 2025, 10:00–12:00
Venue: Room 101, Haina Yuan Building 2 (海纳苑2幢101)
Abstract: We develop a systematic theory for the approximation capabilities and spectral behavior of shallow ReLU neural networks. Our recent results on linearized neural networks, in which the inner parameters are fixed, show that such models can achieve approximation rates comparable to those of fully nonlinear shallow ReLU networks, provided the target function lies in a slightly smaller class—namely, a Sobolev space rather than the Barron space used in the nonlinear theory. We further investigate the metric entropy of these function classes, demonstrating that their expressive complexity is comparable to that of high-order Sobolev classes. This analysis indicates that neural networks do not, in fact, fundamentally overcome the curse of dimensionality.
We further establish a saturation phenomenon for shallow ReLU models and conduct a detailed spectral analysis of the associated mass and stiffness matrices, providing insight into conditioning and stability when these models are used as discretization schemes for partial differential equations. Numerical experiments validate the theoretical predictions and illustrate that linearized networks offer a competitive alternative to traditional finite element methods for PDE approximation.
Biography: Tong Mao is a Postdoctoral Researcher at King Abdullah University of Science and Technology (KAUST). He received his Ph.D. from the City University of Hong Kong. His research focuses on the mathematical theory of machine learning, including the approximation theory of neural networks and spectral methods for function approximation. He has published multiple works on the approximation properties of ReLU and ReLU^k networks and of deep convolutional neural networks, with a focus on function spaces such as Korobov and Sobolev spaces.