School of Mathematical Sciences

Seminar on Computation and Applications: Evolution of Discriminator and Generator Gradients in GAN Training: From Fitting to Collapse

Source: School of Mathematical Sciences    Published: 2025-03-04

Speaker: Weiguo Gao (Fudan University)


Time: March 14, 2025, 14:45-15:45


Venue: Room 210, Haina Yuan (海纳苑) Building 2


Abstract: Generative Adversarial Networks (GANs) are powerful generative models but often suffer from mode mixture and mode collapse. We propose a perspective that views GAN training as a two-phase progression from fitting to collapse, where mode mixture and mode collapse are treated as interconnected phenomena. Inspired by the particle model interpretation of GANs, we leverage the discriminator gradient to analyze particle movement and the generator gradient, specifically its steepness, to quantify the severity of mode mixture by measuring the generator's sensitivity to changes in the latent space. Using these theoretical insights into the evolution of gradients, we design a specialized metric that integrates both gradients to detect the transition from fitting to collapse. This metric forms the basis of an early stopping algorithm, which stops training at a point that retains sample quality and diversity. Experiments on synthetic and real-world datasets, including MNIST, Fashion MNIST, and CIFAR-10, validate our theoretical findings and demonstrate the effectiveness of the proposed algorithm. This is joint work with Ming Li.
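
To make the abstract's idea concrete, the following is a minimal, hypothetical sketch of a gradient-based stopping check in PyTorch. The abstract does not specify the actual metric, so the quantities here (average discriminator gradient norm over generated samples, a finite-difference estimate of generator "steepness" in latent space, and the placeholder thresholds in should_stop) are illustrative assumptions, not the authors' formulation.

```python
# Hypothetical sketch of the fitting-to-collapse monitoring idea.
# D and G are assumed to be torch.nn.Module discriminator/generator;
# all names and thresholds below are illustrative placeholders.
import torch

def discriminator_grad_norm(D, x_fake):
    """Average norm of the discriminator gradient w.r.t. generated samples,
    a proxy for how strongly the 'particles' (samples) are still being pushed."""
    x = x_fake.detach().requires_grad_(True)
    d_out = D(x).sum()
    (grad_x,) = torch.autograd.grad(d_out, x)
    return grad_x.flatten(1).norm(dim=1).mean().item()

def generator_steepness(G, z, eps=1e-3):
    """Finite-difference estimate of the generator's sensitivity to latent
    perturbations; large values indicate steep regions between modes."""
    with torch.no_grad():
        v = torch.randn_like(z)
        v = v / v.flatten(1).norm(dim=1, keepdim=True)
        diff = G(z + eps * v) - G(z)
    return (diff.flatten(1).norm(dim=1) / eps).mean().item()

def should_stop(D, G, z_batch, grad_threshold=0.05, steep_threshold=50.0):
    """Illustrative stopping rule: stop once discriminator gradients have
    shrunk (samples fit the data) while generator steepness grows
    (onset of mode mixture/collapse). Thresholds are placeholders."""
    g_disc = discriminator_grad_norm(D, G(z_batch))
    g_steep = generator_steepness(G, z_batch)
    return g_disc < grad_threshold and g_steep > steep_threshold
```

In practice such a check would be evaluated periodically on a fixed batch of latent codes during training; the actual metric and decision rule are those developed in the talk.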


Speaker bio: Weiguo Gao is a professor and doctoral supervisor at the School of Mathematical Sciences, Fudan University, vice dean of the School of Data Science at Fudan University, a member of the National Steering Committee for Graduate Education in the Applied Statistics Professional Degree, and vice chair of the Computational Mathematics Branch of the Chinese Mathematical Society. His research focuses on numerical linear algebra and high-performance computing, including linear and nonlinear eigenvalue problems, large-scale scientific and parallel computing, electronic structure and saddle point computations, and numerical analysis problems in data science. He has published high-quality papers in computational mathematics journals such as SINUM, SISC, SIMAX, Numer Math, and J Comp Phys, as well as in applied journals such as ACM TOMS, IEEE TAC, Int J Numer Meth Eng, JACS, J Chem Phys, Comput Phys Commun, and Comp Mater Sci.

