https://mp.weixin.qq.com/s/NjHULEEUUggG0iGd7imaeg Friday Seminar Update
YMSC Yau Mathematical Sciences Center, Tsinghua University 2022-04-13 16:49
Applied Mathematics Seminar
An Algebraically Converging Stochastic Gradient Descent Algorithm for Global Optimization
Yunan Yang
Organizer / 组织者
包承龙、史作强
Speaker / 主讲人
Yunan Yang (ETH Zurich)
Time / 时间
Fri. 4:30-5:30 pm, 2022-04-15
Venue / 地点
Tencent Meeting: 513 143 699
Abstract
We propose a new stochastic gradient descent algorithm, referred to here as "AdaVar", for finding the global optimizer of nonconvex optimization problems. A key component of the algorithm is the adaptive tuning of the randomness based on the value of the objective function. In the language of simulated annealing, the temperature is state-dependent. With this, we can prove global convergence at an algebraic rate, both in probability and in the parameter space. This is a major improvement over the classical rate obtained with a simpler control of the noise term. The convergence proof is based on the actual discrete setup of the algorithm. We also present several numerical examples demonstrating the efficiency and robustness of the algorithm for global convergence.
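The core idea of the abstract, gradient descent with injected noise whose magnitude depends on the current objective value, can be sketched as follows. This is a minimal illustration, not the AdaVar algorithm from the talk: the test objective, the step size, and the schedule sigma(x) = sqrt(f(x) - f_low) with an assumed lower bound f_low are all hypothetical choices made only to show a state-dependent "temperature" in action.

```python
import numpy as np

def f(x):
    """Nonconvex test objective with a shallow local minimum near x ~ 0.65
    and the global minimum near x ~ -0.75 (hypothetical example)."""
    return x**4 - x**2 + 0.2 * x

def grad_f(x):
    return 4 * x**3 - 2 * x + 0.2

def sgd_state_dependent_noise(x0, steps=50_000, lr=1e-3, f_low=-0.5, seed=0):
    """Gradient descent plus noise whose amplitude shrinks as f(x)
    approaches the lower-bound estimate f_low, i.e. the 'temperature'
    depends on the state. Illustrative schedule, not the AdaVar rule."""
    rng = np.random.default_rng(seed)
    x = x0
    best_x, best_f = x, f(x)
    for _ in range(steps):
        # State-dependent noise level: colder near low objective values.
        sigma = np.sqrt(max(f(x) - f_low, 0.0))
        x = x - lr * grad_f(x) + np.sqrt(lr) * sigma * rng.standard_normal()
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x

# Starting in the basin of the shallow local minimum at x = 0.8, the noise
# is still "hot" enough to escape, while near the global minimum it cools,
# so the iterate settles there.
x_best = sgd_state_dependent_noise(x0=0.8)
```

Plain gradient descent from the same starting point would remain trapped in the shallow local minimum; the state-dependent noise supplies the escape mechanism while automatically quieting down once a low objective value is reached.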
Speaker
Yunan Yang is an applied mathematician working on inverse problems and optimal transport. Currently, Yunan is an Advanced Fellow at the Institute for Theoretical Studies at ETH Zurich. She will be a Tenure-Track Assistant Professor in the Department of Mathematics at Cornell University starting in July 2023. Yunan Yang earned a Ph.D. degree in mathematics from the University of Texas at Austin in 2018, supervised by Prof. Bjorn Engquist. From September 2018 to August 2021, Yunan was a Courant Instructor at the Courant Institute of Mathematical Sciences, New York University.