
Data Science | The Pruning Technique of Neural Networks Related to Compressed Sensing


Speaker : Dr. XIA Yu (Hangzhou Normal University)

Time : 2019-10-28 15:30-17:30

Location : Room 417, Teaching Building 11, Yuquan Campus

Speaker: Dr. XIA Yu (Hangzhou Normal University)

Title: The Pruning Technique of Neural Networks Related to Compressed Sensing

Time and Location: October 28, 2019, 3:30-5:30 PM, Room 417, Teaching Building 11 (教十一417), Yuquan Campus
Abstract: We discuss Net-Trim, a technique that simplifies a trained neural network. The method is a convex post-processing module that prunes a trained network layer by layer while preserving the internal responses. The models before and after Net-Trim is applied are consistent, and the number of training samples needed to discover a network can be expressed in terms of its number of nonzero weights. Specifically, if there is a set of weights using at most s terms that can re-create the layer outputs from the layer inputs, these weights can be found from O(s log(N/s)) samples, where N is the input size. The theoretical results are similar to those for sparse regression using the Lasso, and the tools are related to concentration of measure and convex analysis.
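The layer-wise idea in the abstract can be sketched as a sparse-regression problem: given a layer's inputs X and its responses Y, find a sparser weight matrix whose outputs still match Y. The sketch below is a minimal illustration using iterative soft-thresholding (ISTA) on a linear layer, assuming NumPy; the actual Net-Trim formulation is a convex program that also handles the ReLU nonlinearity, which is omitted here. All variable names and problem sizes are illustrative.

```python
import numpy as np

# Illustrative setup: a "trained" linear layer with s-sparse weight rows,
# and the responses Y = X @ W that pruning must preserve.
rng = np.random.default_rng(0)
n, N, m = 200, 50, 10                      # samples, input size, layer width
W = np.zeros((N, m))
support = rng.choice(N, size=5, replace=False)
W[support] = rng.standard_normal((5, m))   # only 5 of N input rows are active
X = rng.standard_normal((n, N))
Y = X @ W

def ista(X, Y, lam=1.0, iters=500):
    """ISTA for min_W 0.5 * ||X W - Y||_F^2 + lam * ||W||_1,
    the Lasso-style surrogate for finding sparse, response-preserving weights."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    Wk = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(iters):
        G = X.T @ (X @ Wk - Y)             # gradient of the smooth fit term
        Z = Wk - G / L                     # gradient step
        Wk = np.sign(Z) * np.maximum(np.abs(Z) - lam / L, 0.0)  # soft-threshold
    return Wk

W_hat = ista(X, Y)
nonzero_frac = np.mean(np.abs(W_hat) > 1e-3)
rel_err = np.linalg.norm(X @ W_hat - Y) / np.linalg.norm(Y)
print(f"nonzero fraction: {nonzero_frac:.2f}, relative response error: {rel_err:.3f}")
```

With n = 200 samples and only 5 active input rows, the recovered weights are sparse while the layer responses are reproduced almost exactly, mirroring the O(s log(N/s)) sample-complexity statement in the abstract.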


Contact: Prof. LI Song (songli@zju.edu.cn)


Date: 2019-10-23