Acceleration in first-order optimization methods: promenading beyond convexity or smoothness, and applications

Acceleration in optimization is a term generally applied to optimization algorithms that share a common methodology and enjoy convergence rates improving over those of simpler algorithms for the same problem. For example, Nesterov's Accelerated Gradient Descent improves over plain gradient descent on smooth convex problems, reducing the function-value gap at a rate of O(1/k²) rather than O(1/k).
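As an illustration of the acceleration phenomenon the abstract mentions, here is a minimal sketch of Nesterov's Accelerated Gradient Descent compared against plain gradient descent on a simple quadratic. The function, step sizes, and momentum schedule `k/(k+3)` are standard textbook choices, not taken from the thesis itself.

```python
import numpy as np

def nesterov_agd(grad, x0, L, iters):
    """Nesterov's accelerated gradient descent for an L-smooth convex f,
    using the standard momentum schedule k/(k+3)."""
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    for k in range(iters):
        x_next = y - grad(y) / L          # gradient step at the extrapolated point
        y = x_next + (k / (k + 3)) * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

def gradient_descent(grad, x0, L, iters):
    """Plain gradient descent with step size 1/L, for comparison."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - grad(x) / L
    return x

# Illustrative quadratic f(x) = 0.5 * x^T A x (minimizer at the origin);
# L is the largest eigenvalue of A.
A = np.diag([1.0, 100.0])
grad = lambda x: A @ x
L = 100.0
x0 = np.array([1.0, 1.0])

x_agd = nesterov_agd(grad, x0, L, 100)
x_gd = gradient_descent(grad, x0, L, 100)
# After 100 iterations, the accelerated iterate is much closer to the minimizer.
```

On this example, gradient descent contracts the poorly conditioned coordinate by only a factor of 0.99 per step, while the accelerated method's O(1/k²) guarantee already forces a substantially smaller error after 100 iterations.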

Overview

Bibliographic Details
Main Author: Martinez Rubio, D
Other Authors: Kanade, V
Format: Thesis
Language: English
Published: 2021
Subjects: