Acceleration in first-order optimization methods: promenading beyond convexity or smoothness, and applications
Acceleration in optimization is a term generally applied to optimization algorithms that share a common methodology and enjoy convergence rates improving on those of simpler algorithms for the same problem. For example, Nesterov's Accelerated Gradient Descent improves...
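To make the abstract's example concrete, below is a minimal sketch of Nesterov's Accelerated Gradient Descent on an L-smooth convex quadratic. The problem instance, step size, and iteration count are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

# Sketch of Nesterov's Accelerated Gradient Descent (AGD) on an
# L-smooth convex quadratic f(x) = 0.5 * x @ A @ x - b @ x.
# The problem instance and iteration budget are illustrative
# assumptions, not taken from the thesis.

rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)            # positive definite, so f is convex
b = rng.standard_normal(n)

def grad(x):
    return A @ x - b               # gradient of f

L = np.linalg.eigvalsh(A).max()    # smoothness constant: largest eigenvalue of A

x = np.zeros(n)                    # iterate
y = np.zeros(n)                    # extrapolated point
t = 1.0                            # momentum parameter
for _ in range(200):
    x_next = y - grad(y) / L                            # gradient step at y
    t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0   # Nesterov's schedule
    y = x_next + ((t - 1.0) / t_next) * (x_next - x)    # momentum extrapolation
    x, t = x_next, t_next

print("final gradient norm:", np.linalg.norm(grad(x)))
```

On L-smooth convex problems this scheme attains an O(1/k^2) rate in function value, versus O(1/k) for plain gradient descent with the same step size, which is the sense in which it "accelerates".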
Main author:
Other authors:
Format: Thesis
Language: English
Published: 2021
Subjects: