NSGA-II/SDR-OLS: A Novel Large-Scale Many-Objective Optimization Method Using Opposition-Based Learning and Local Search

Many-objective optimization problems (MaOPs) have recently become a topic of great interest in academia and industry, and a growing number of many-objective evolutionary algorithms (MaOEAs) have been proposed. NSGA-II/SDR (NSGA-II with a strengthened dominance relation) is an improved NSGA-II that replaces the traditional Pareto dominance relation with a new dominance relation, termed SDR; it outperforms the original algorithm on small-scale MaOPs with few decision variables but performs poorly on large-scale MaOPs. To address this, we extend NSGA-II/SDR to obtain NSGA-II/SDR-OLS, which better balances population convergence and diversity when solving large-scale MaOPs, through two improvements: (1) an opposition-based learning (OBL) strategy is introduced at the population initialization stage, and the final initial population is formed from the random initial population and its opposition-based counterpart, which improves the quality and convergence of the population; (2) a local search (LS) strategy is introduced to increase population diversity by exploring neighborhood solutions, preventing solutions from falling into local optima too early. NSGA-II/SDR-OLS is first compared with the original algorithm on nine benchmark problems to verify the effectiveness of the improvements. We then compare it with six existing algorithms: a promising-region-based multi-objective evolutionary algorithm (PREA), a scalable small-subpopulation-based covariance matrix adaptation evolution strategy (S3-CMA-ES), a decomposition-based multi-objective evolutionary algorithm guided by growing neural gas (DEA-GNG), a reference-vector-guided evolutionary algorithm (RVEA), NSGA-II with a conflict-based partitioning strategy (NSGA-II-conflict), and a genetic algorithm using reference-point-based non-dominated sorting (NSGA-III). The proposed algorithm achieves the best results in the vast majority of test cases, indicating that it is highly competitive.
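
To make the two additions concrete, the following minimal Python sketch shows the general form of opposition-based initialization and a simple neighborhood local search. It is not the authors' implementation: the bounds, population size, Gaussian perturbation, and scalar placeholder fitness below are illustrative assumptions, whereas the paper ranks solutions with its many-objective, SDR-based criteria.

# Minimal sketch (not the authors' code) of OBL initialization and a simple
# neighborhood local search; names, bounds, and the scalar placeholder fitness
# are assumptions for illustration only.
import numpy as np

def placeholder_fitness(x):
    # Stand-in scalar score (sphere function). The actual algorithm evaluates
    # many objectives and compares solutions with the SDR dominance relation.
    return float(np.sum(x ** 2))

def obl_initialization(pop_size, n_vars, lower, upper, rng):
    # Random population plus its opposition-based counterpart; keep the better half.
    pop = rng.uniform(lower, upper, size=(pop_size, n_vars))
    opposite = lower + upper - pop  # standard OBL opposite point
    combined = np.vstack([pop, opposite])
    scores = np.array([placeholder_fitness(x) for x in combined])
    return combined[np.argsort(scores)[:pop_size]]

def local_search(x, lower, upper, rng, step=0.05, n_neighbors=5):
    # Sample a few Gaussian neighbors around x and return the best one found.
    best_x, best_f = x, placeholder_fitness(x)
    for _ in range(n_neighbors):
        neighbor = np.clip(x + rng.normal(0.0, step * (upper - lower), size=x.shape),
                           lower, upper)
        f = placeholder_fitness(neighbor)
        if f < best_f:
            best_x, best_f = neighbor, f
    return best_x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lower, upper = 0.0, 1.0
    pop = obl_initialization(pop_size=20, n_vars=10, lower=lower, upper=upper, rng=rng)
    refined = np.array([local_search(x, lower, upper, rng) for x in pop])
    print(pop.shape, refined.shape)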

Bibliographic Details
Main Authors: Yingxin Zhang, Gaige Wang, Hongmei Wang
Author Affiliations: School of Computer Science and Technology, Ocean University of China, Qingdao 266100, China (Y. Zhang, G. Wang); Information Engineering College, Xinjiang Institute of Engineering, Urumqi 830023, China (H. Wang)
Format: Article
Language: English
Published: MDPI AG, 2023-04-01
Series: Mathematics, Vol. 11, Iss. 8, Article 1911
ISSN: 2227-7390
DOI: 10.3390/math11081911
Subjects: evolutionary algorithm; many-objective optimization; large-scale optimization; opposition-based learning; local search
Online Access: https://www.mdpi.com/2227-7390/11/8/1911