Cross Entropy in Deep Learning of Classifiers Is Unnecessary—ISBE Error Is All You Need

In deep learning of classifiers, the cost function usually takes the form of a combination of the SoftMax and CrossEntropy functions. The SoftMax unit transforms the scores predicted by the model network into estimates of the probability of an object’s membership in each class. On the ot...
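
As a point of reference for the SoftMax–CrossEntropy combination mentioned in the abstract, the following is a minimal sketch of how network scores are turned into class probabilities and scored against a true label. The function names and the example scores are illustrative assumptions, not code from the article, and this shows only the standard loss, not the ISBE error proposed by the author.

```python
import numpy as np

def softmax(scores):
    # Shift by the maximum score for numerical stability before exponentiating.
    shifted = scores - np.max(scores, axis=-1, keepdims=True)
    exp_scores = np.exp(shifted)
    return exp_scores / np.sum(exp_scores, axis=-1, keepdims=True)

def cross_entropy(probs, target_index):
    # Negative log-probability assigned to the true class.
    return -np.log(probs[target_index])

# Hypothetical scores produced by a network for one object over three classes.
scores = np.array([2.0, -1.0, 0.5])
probs = softmax(scores)           # membership probabilities, summing to 1
loss = cross_entropy(probs, 0)    # loss if class 0 is the true class
print(probs, loss)
```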


Bibliographic Details
Main Author: Władysław Skarbek
Format: Article
Language: English
Published: MDPI AG 2024-01-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/26/1/65