MPCE: A Maximum Probability Based Cross Entropy Loss Function for Neural Network Classification

In recent years, multi-classifier learning has been of significant interest in industrial and economic fields, and neural networks are a popular approach to it. However, the accuracy of a neural network is often limited by its loss function. For this reason, we design a novel cross entropy loss function, named MPCE, which is based on the maximum probability in the predictive results. In this paper, we first analyze the difference between the gradients of MPCE and of the standard cross entropy loss function. We then propose a gradient update algorithm based on MPCE. In the experimental part of the paper, we use four groups of experiments to verify the performance of the proposed algorithm on six public datasets. The first group of results shows that the proposed algorithm converges faster than algorithms based on other loss functions. The second group shows that it obtains the highest training and test accuracy on all six datasets, while the third group shows that it performs better than the others as the number of classes changes on the sensor dataset. Finally, in the fourth group of experiments, we implement the compared methods with a convolutional neural network on the MNIST dataset; the results show that the proposed algorithm achieves the highest accuracy among all of the methods evaluated.
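
Note: this record does not give the exact MPCE formula. The sketch below is only an illustrative assumption of how a maximum-probability-weighted cross entropy could be implemented in Python; the weighting scheme and the name mpce_like_loss are hypothetical and not taken from the paper, whose actual definition is available at the IEEE Xplore link below.

import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=-1, keepdims=True)

def cross_entropy(probs, labels, eps=1e-12):
    # Per-sample cross entropy for integer class labels.
    n = probs.shape[0]
    return -np.log(probs[np.arange(n), labels] + eps)

def mpce_like_loss(logits, labels):
    # Hypothetical sketch (assumption, not the authors' definition):
    # weight each sample's cross-entropy term by the maximum predicted
    # probability of that sample, then average over the batch.
    probs = softmax(logits)
    ce = cross_entropy(probs, labels)
    max_prob = probs.max(axis=-1)
    return float((max_prob * ce).mean())

# Usage example with random logits for an 8-sample, 4-class batch.
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 4))
labels = rng.integers(0, 4, size=8)
print(mpce_like_loss(logits, labels))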

Bibliographic Details
Main Authors: Yangfan Zhou, Xin Wang, Mingchuan Zhang, Junlong Zhu, Ruijuan Zheng, Qingtao Wu
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Subjects: Cross entropy; loss function; maximum probability; neural network classification; softmax
Online Access: https://ieeexplore.ieee.org/document/8862886/
_version_ 1818877656214536192
author Yangfan Zhou
Xin Wang
Mingchuan Zhang
Junlong Zhu
Ruijuan Zheng
Qingtao Wu
author_sort Yangfan Zhou
collection DOAJ
description In recent years, multi-classifier learning has been of significant interest in industrial and economic fields, and neural networks are a popular approach to it. However, the accuracy of a neural network is often limited by its loss function. For this reason, we design a novel cross entropy loss function, named MPCE, which is based on the maximum probability in the predictive results. In this paper, we first analyze the difference between the gradients of MPCE and of the standard cross entropy loss function. We then propose a gradient update algorithm based on MPCE. In the experimental part of the paper, we use four groups of experiments to verify the performance of the proposed algorithm on six public datasets. The first group of results shows that the proposed algorithm converges faster than algorithms based on other loss functions. The second group shows that it obtains the highest training and test accuracy on all six datasets, while the third group shows that it performs better than the others as the number of classes changes on the sensor dataset. Finally, in the fourth group of experiments, we implement the compared methods with a convolutional neural network on the MNIST dataset; the results show that the proposed algorithm achieves the highest accuracy among all of the methods evaluated.
first_indexed 2024-12-19T14:01:45Z
format Article
id doaj.art-622028b4014b496bb513edbe27e4e6eb
institution Directory Open Access Journal
issn 2169-3536
language English
last_indexed 2024-12-19T14:01:45Z
publishDate 2019-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling doaj.art-622028b4014b496bb513edbe27e4e6eb
 record updated: 2022-12-21T20:18:26Z
 language: eng
 publisher: IEEE
 journal: IEEE Access (ISSN 2169-3536)
 published: 2019-01-01, volume 7, pages 146331-146341
 doi: 10.1109/ACCESS.2019.2946264
 ieee article number: 8862886
 title: MPCE: A Maximum Probability Based Cross Entropy Loss Function for Neural Network Classification
 authors and affiliations:
  Yangfan Zhou, College of Information Engineering, Henan University of Science and Technology, Luoyang, China
  Xin Wang, Laboratory of Applied Brain and Cognitive Sciences, Postdoctoral Research Station, School of Business and Management, Shanghai International Studies University, Shanghai, China
  Mingchuan Zhang (https://orcid.org/0000-0002-2523-1089), College of Information Engineering, Henan University of Science and Technology, Luoyang, China
  Junlong Zhu, College of Information Engineering, Henan University of Science and Technology, Luoyang, China
  Ruijuan Zheng (https://orcid.org/0000-0002-0932-8788), College of Information Engineering, Henan University of Science and Technology, Luoyang, China
  Qingtao Wu, College of Information Engineering, Henan University of Science and Technology, Luoyang, China
 abstract: as given in the description field above
 online access: https://ieeexplore.ieee.org/document/8862886/
 keywords: Cross entropy; loss function; maximum probability; neural network classification; softmax
title MPCE: A Maximum Probability Based Cross Entropy Loss Function for Neural Network Classification
topic Cross entropy
loss function
maximum probability
neural network classification
softmax
url https://ieeexplore.ieee.org/document/8862886/