School of Mathematical Sciences

Data Science | Stochastic Optimization for AUC Maximization in Machine Learning

Source: School of Mathematical Sciences   Published: 2019-12-25

Time: December 26, 16:00-17:00

Speaker: Prof. Yiming Ying (State University of New York at Albany)

Venue: Room 200-9, Gongshang Building, Yuquan Campus

Stochastic optimization algorithms such as stochastic gradient descent (SGD) update the model sequentially with cheap per-iteration costs, making them amenable to large-scale streaming data analysis. However, most existing studies focus on classification accuracy and cannot be directly applied to the important problems of maximizing the Area Under the ROC Curve (AUC) in imbalanced classification and bipartite ranking.
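The obstacle is that AUC is a pairwise statistic: its empirical surrogate couples every positive example with every negative one, so it does not decompose into a sum of per-example losses the way accuracy-based objectives do. As a sketch, in notation chosen here only for illustration (with $\ell$ a convex surrogate such as the squared loss $\ell(t)=(1-t)^2$ and $\mathbf{w}$ a linear scoring model):

\[
\min_{\mathbf{w}} \;\; \frac{1}{n_+ n_-} \sum_{i:\, y_i=+1} \; \sum_{j:\, y_j=-1} \ell\!\left(\mathbf{w}^\top \mathbf{x}_i - \mathbf{w}^\top \mathbf{x}_j\right),
\]

where $n_+$ and $n_-$ count the positive and negative examples. Since every term depends on a pair of examples, a single incoming data point does not yield an unbiased stochastic gradient, which is what the reformulations discussed in the talk address.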
In this talk, I will present our recent work on developing novel SGD-type algorithms for AUC maximization. In particular, we establish minimax-optimal rates for existing AUC maximization algorithms using a stability and generalization trade-off framework. We then exploit the problem structure to develop fast stochastic optimization algorithms for AUC maximization with the least-squares loss. The main idea is to reformulate it as a stochastic saddle-point (min-max) problem. Furthermore, we extend the saddle-point formulation to general losses via Bernstein polynomial approximation, and to the setting of deep neural networks. This talk is based on a series of joint works with Michael Natole, Neyo Yang, Wei Shen, Siwei Lyu, and Tianbao Yang.
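To make the saddle-point idea concrete, the following is a minimal one-pass sketch of a primal-dual stochastic update for the square-loss case with a linear scoring model, modeled on the standard min-max reformulation; the function name, variable names, step-size schedule, and the fixed class-prior estimate p_hat are illustrative assumptions, not the exact algorithm presented in the talk.

import numpy as np

def auc_saddle_sgd(stream, d, eta=1.0, p_hat=0.5):
    # Sketch only: `stream` yields (x, y) with x a length-d array and y in {+1, -1};
    # p_hat is an assumed positive-class ratio (in practice it can be estimated online).
    w = np.zeros(d)   # primal: linear scorer
    a = 0.0           # primal: mean score of the positive class
    b = 0.0           # primal: mean score of the negative class
    alpha = 0.0       # dual variable coupling the two classes
    for t, (x, y) in enumerate(stream, start=1):
        eta_t = eta / np.sqrt(t)   # decaying step size (illustrative choice)
        s = float(w @ x)           # current score of the example
        if y == 1:
            grad_w = 2.0 * (1.0 - p_hat) * ((s - a) - (1.0 + alpha)) * x
            grad_a = -2.0 * (1.0 - p_hat) * (s - a)
            grad_b = 0.0
            grad_alpha = -2.0 * (1.0 - p_hat) * s - 2.0 * p_hat * (1.0 - p_hat) * alpha
        else:
            grad_w = 2.0 * p_hat * ((s - b) + (1.0 + alpha)) * x
            grad_a = 0.0
            grad_b = -2.0 * p_hat * (s - b)
            grad_alpha = 2.0 * p_hat * s - 2.0 * p_hat * (1.0 - p_hat) * alpha
        # Descend on the primal variables, ascend on the dual variable.
        w -= eta_t * grad_w
        a -= eta_t * grad_a
        b -= eta_t * grad_b
        alpha += eta_t * grad_alpha
    return w, a, b

Each update costs O(d) per example and never needs to pair the incoming point with previously seen data, which is what makes the saddle-point view amenable to streaming data.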

Contact: Zhengchu Guo (guozhengchu@zju.edu.cn)
