Subgroup Preference Neural Network
Subgroup label ranking, which aims to rank groups of labels using a single ranking model, is a new problem in preference learning. This paper introduces the Subgroup Preference Neural Network (<i>SGPNN</i>), which combines multiple networks with different activation functions, learning rates,...
Main Authors: | Ayman Elgharabawy, Mukesh Prasad, Chin-Teng Lin |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-09-01 |
Series: | Sensors |
Subjects: | preference learning; neural network; label ranking; stairstep; spearman rank correlation |
Online Access: | https://www.mdpi.com/1424-8220/21/18/6104 |
_version_ | 1797517264837148672 |
---|---|
author | Ayman Elgharabawy Mukesh Prasad Chin-Teng Lin |
author_facet | Ayman Elgharabawy Mukesh Prasad Chin-Teng Lin |
author_sort | Ayman Elgharabawy |
collection | DOAJ |
description | Subgroup label ranking, which aims to rank groups of labels using a single ranking model, is a new problem in preference learning. This paper introduces the Subgroup Preference Neural Network (<i>SGPNN</i>), which combines multiple networks with different activation functions, learning rates, and output layers into one artificial neural network (<i>ANN</i>) to discover the hidden relations between the subgroups’ multi-labels. The <i>SGPNN</i> is a feedforward (<i>FF</i>), partially connected network that has a single middle layer and uses a stairstep (<i>SS</i>) multi-valued activation function to enhance the prediction probability and accelerate ranking convergence. The novel structure of the proposed <i>SGPNN</i> consists of a multi-activation function neuron (<i>MAFN</i>) in the middle layer to rank each subgroup independently. The <i>SGPNN</i> uses gradient ascent to maximize the Spearman ranking correlation between the groups of labels. Each label is represented by an output neuron that has a single <i>SS</i> function. Using a conjoint dataset, the proposed <i>SGPNN</i> outperforms other label ranking methods that use each dataset individually. The proposed <i>SGPNN</i> achieves an average accuracy of 91.4% using the conjoint dataset, compared to supervised clustering, decision trees, multilayer perceptron label ranking, and label ranking forests, which achieve average accuracies of 60%, 84.8%, 69.2%, and 73%, respectively, on the individual datasets. |
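The abstract names two core mechanisms: a multi-valued stairstep activation whose output neurons emit discrete rank-like values, and gradient ascent on the Spearman rank correlation between predicted and true label rankings. As a rough illustration only (this is not the authors' implementation; the function names, number of levels, and step width below are assumptions), those two pieces can be sketched as:

```python
import numpy as np

def stairstep(x, n_levels=4, width=1.0):
    """Illustrative multi-valued stairstep activation: quantizes the
    input into n_levels discrete steps, so each output neuron emits a
    discrete rank-like value rather than a continuous one."""
    clipped = np.clip(x, 0.0, width)                    # restrict to the step range
    levels = np.floor(clipped / width * n_levels)       # quantize into integer steps
    return levels.clip(0, n_levels - 1)                 # top edge maps to the last step

def spearman_rho(a, b):
    """Spearman rank correlation between two rankings (assumes no ties);
    a quantity of this form is what the SGPNN's gradient ascent maximizes."""
    a, b = np.asarray(a), np.asarray(b)
    n = len(a)
    ra = a.argsort().argsort()   # rank of each element of a
    rb = b.argsort().argsort()   # rank of each element of b
    d = ra - rb                  # per-label rank differences
    return 1.0 - 6.0 * np.sum(d * d) / (n * (n * n - 1))
```

Identical rankings give a correlation of 1 and fully reversed rankings give -1, so driving this objective upward pushes the predicted ranking toward the ground-truth ordering.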
first_indexed | 2024-03-10T07:13:21Z |
format | Article |
id | doaj.art-46e4ff63bd5e4817af8e87cbc0ebfd75 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
last_indexed | 2024-03-10T07:13:21Z |
publishDate | 2021-09-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj.art-46e4ff63bd5e4817af8e87cbc0ebfd75 2023-11-22T15:11:42Z eng MDPI AG Sensors 1424-8220 2021-09-01 vol. 21, no. 18, art. 6104 doi:10.3390/s21186104 Subgroup Preference Neural Network Ayman Elgharabawy; Mukesh Prasad; Chin-Teng Lin (Australian Artificial Intelligence Institute, School of Computer Science, University of Technology Sydney, Ultimo, Sydney 2007, Australia) https://www.mdpi.com/1424-8220/21/18/6104 preference learning; neural network; label ranking; stairstep; spearman rank correlation |
spellingShingle | Ayman Elgharabawy Mukesh Prasad Chin-Teng Lin Subgroup Preference Neural Network Sensors preference learning neural network label ranking stairstep spearman rank correlation |
title | Subgroup Preference Neural Network |
title_full | Subgroup Preference Neural Network |
title_fullStr | Subgroup Preference Neural Network |
title_full_unstemmed | Subgroup Preference Neural Network |
title_short | Subgroup Preference Neural Network |
title_sort | subgroup preference neural network |
topic | preference learning neural network label ranking stairstep spearman rank correlation |
url | https://www.mdpi.com/1424-8220/21/18/6104 |
work_keys_str_mv | AT aymanelgharabawy subgrouppreferenceneuralnetwork AT mukeshprasad subgrouppreferenceneuralnetwork AT chintenglin subgrouppreferenceneuralnetwork |