An improvement of stochastic gradient descent approach for mean-variance portfolio optimization problem


Bibliographic Details
Main Authors: Su, Stephanie S. W.; Kek, Sie Long
Format: Article
Language: English
Published: 2021
Subjects:
Online Access:http://eprints.uthm.edu.my/2327/1/J12286_58525d433e35f3854a4226ebd4fc4e38.pdf
Description
Summary: In this paper, the current variant of the stochastic gradient descent (SGD) approach, namely the adaptive moment estimation (Adam) approach, is improved by adding the standard error to the updating rule. The aim is to speed up the convergence of the Adam algorithm. This improvement is termed the Adam with standard error (AdamSE) algorithm. On the other hand, the mean-variance portfolio optimization model is formulated from historical data on the rates of return of the S&P 500 stock, the 10-year Treasury bond, and the money market. The application of the SGD, Adam, adaptive moment estimation with maximum (AdaMax), Nesterov-accelerated adaptive moment estimation (Nadam), AMSGrad, and AdamSE algorithms to solve the mean-variance portfolio optimization problem is further investigated. During the calculation procedure, the iterative solution converges to the optimal portfolio solution. It is observed that the AdamSE algorithm requires the smallest number of iterations. The results show that the rate of convergence of the Adam algorithm is significantly enhanced by the AdamSE algorithm. In conclusion, the efficiency of the improved Adam algorithm using the standard error has been demonstrated. Furthermore, the applicability of the SGD, Adam, AdaMax, Nadam, AMSGrad, and AdamSE algorithms in solving the mean-variance portfolio optimization problem is validated.
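The setup described above can be sketched in code. The following is a minimal illustration, not the paper's method or data: it minimizes a standard mean-variance objective f(w) = (r/2)·wᵀΣw − μᵀw over three assets using the plain Adam update (the abstract does not spell out exactly how AdamSE incorporates the standard error, so that modification is not reproduced here). The return vector `mu`, covariance `Sigma`, and risk-aversion level are invented illustrative values; a softmax parameterization keeps the weights on the simplex.

```python
import numpy as np

# Illustrative (not the paper's) inputs: expected returns and covariance
# for three assets in the spirit of stock / bond / money market.
mu = np.array([0.10, 0.05, 0.02])
Sigma = np.array([[0.040, 0.005, 0.001],
                  [0.005, 0.010, 0.000],
                  [0.001, 0.000, 0.001]])
risk_aversion = 3.0

def objective_grad(z):
    """Gradient of f(w) = (r/2) w'Sigma w - mu'w w.r.t. z, where w = softmax(z)."""
    w = np.exp(z) / np.exp(z).sum()           # softmax keeps weights positive, summing to 1
    g_w = risk_aversion * Sigma @ w - mu      # df/dw
    J = np.diag(w) - np.outer(w, w)           # softmax Jacobian dw/dz
    return J @ g_w, w

# Plain Adam with the usual default hyperparameters; AdamSE would modify
# this update using the standard error of the gradient.
z = np.zeros(3)
m = np.zeros(3)
v = np.zeros(3)
alpha, b1, b2, eps = 0.05, 0.9, 0.999, 1e-8
for t in range(1, 2001):
    g, w = objective_grad(z)
    m = b1 * m + (1 - b1) * g                 # first-moment estimate
    v = b2 * v + (1 - b2) * g**2              # second-moment estimate
    m_hat = m / (1 - b1**t)                   # bias correction
    v_hat = v / (1 - b2**t)
    z -= alpha * m_hat / (np.sqrt(v_hat) + eps)

w_opt = np.exp(z) / np.exp(z).sum()
print("weights:", np.round(w_opt, 3))
print("expected return:", float(w_opt @ mu))
```

With the softmax parameterization, the budget constraint is enforced by construction, so the unconstrained Adam iteration can be applied directly; the paper's iteration counts for SGD, Adam, AdaMax, Nadam, AMSGrad, and AdamSE would come from running their respective updates on the same objective.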