Gradient method with multiple damping for large-scale unconstrained optimization
Gradient methods are popular because only the gradient of the objective function is required. However, these methods can be very slow when the objective function is ill-conditioned. One possible reason for the inefficiency of gradient methods is that a constant criterion, whi...
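The abstract is cut off before the paper's multiple-damping scheme is described. As a rough, generic illustration only (not the authors' method), the sketch below applies a single damping factor `theta` to the exact line-search steplength of steepest descent on a convex quadratic; the function name `damped_gradient`, the value of `theta`, and the test problems are all assumptions for illustration.

```python
import numpy as np

# Minimal sketch, assuming a SINGLE damping factor (the paper proposes
# "multiple damping", whose exact form is not given in the truncated
# abstract): damped steepest descent on f(x) = 0.5 * x^T A x,
# illustrating how convergence degrades as A becomes ill-conditioned.

def damped_gradient(A, x0, theta=0.9, tol=1e-6, max_iter=5000):
    """x_{k+1} = x_k - theta * alpha_k * g_k, with exact line-search alpha_k."""
    x = x0.astype(float)
    for k in range(max_iter):
        g = A @ x                          # gradient of 0.5 * x^T A x
        if np.linalg.norm(g) < tol:
            return x, k
        alpha = (g @ g) / (g @ (A @ g))    # exact steplength on a quadratic
        x -= theta * alpha * g             # damped gradient step
    return x, max_iter

for cond in (1e1, 1e3):                    # mildly vs. badly conditioned
    A = np.diag(np.linspace(1.0, cond, 20))
    x, iters = damped_gradient(A, np.ones(20))
    print(f"cond={cond:.0e}: stopped after {iters} iterations, "
          f"||grad|| = {np.linalg.norm(A @ x):.1e}")
```

On the well-conditioned problem the iteration stops quickly, while on the ill-conditioned one it typically exhausts the iteration budget; breaking this slow, zigzagging behavior of constant-criterion steplengths is the motivation the abstract points to.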
Main Authors: | Sim, Hong Seng; Leong, Wah June; Chen, Chuei Yee
---|---
Format: | Article
Published: | Springer, 2019
Similar Items

- Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization
  by: Sim, Hong Seng, et al.
  Published: (2021)
- Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
  by: Sim, Hong Seng, et al.
  Published: (2018)
- A class of diagonally preconditioned limited memory BFGS method for large scale unconstrained optimization
  by: Leong, Wah June, et al.
  Published: (2009)
- Preconditioning on subspace quasi-Newton method for large scale unconstrained optimization
  by: Sim, Hong Seng, et al.
  Published: (2013)
- A new diagonal gradient-type method for large scale unconstrained optimization
  by: Farid, Mahboubeh, et al.
  Published: (2013)