The Compact Support Neural Network
Main Authors: | Adrian Barbu, Hongyu Mou (Statistics Department, Florida State University, Tallahassee, FL 32306, USA) |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-12-01 |
Series: | Sensors |
ISSN: | 1424-8220 |
DOI: | 10.3390/s21248494 |
Subjects: | neural networks; RBF networks; OOD detection; universal approximation |
Online Access: | https://www.mdpi.com/1424-8220/21/24/8494 |
Description: | Neural networks are popular and useful in many fields, but they tend to produce high-confidence responses for examples far from the training data. A network can therefore be very confident in a prediction while making a gross mistake, which limits its reliability for safety-critical applications such as autonomous driving and space exploration. This paper introduces a neuron generalization that has the standard dot-product neuron and the radial basis function (RBF) neuron as the two extreme cases of a shape parameter. With a rectified linear unit (ReLU) as the activation function, the resulting neuron has compact support, meaning its output is zero outside a bounded domain. To address the difficulty of training such a network directly, the paper introduces a training method that starts from a pretrained standard neural network and fine-tunes it while gradually increasing the shape parameter to the desired value. The theoretical contributions are a bound on the gradient of the proposed neuron and a proof that a network with such neurons has the universal approximation property, i.e., it can approximate any continuous and integrable function to an arbitrary degree of accuracy. Experiments on standard benchmark datasets show that the proposed approach obtains smaller test errors than state-of-the-art competing methods and outperforms them at detecting out-of-distribution samples on two of three datasets. |
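
The abstract describes the generalized neuron only at a high level, so the exact parameterization is not available from this record. The sketch below assumes one form consistent with the description: a pre-activation g_alpha(x) = w·x − (alpha/2)(‖x‖² + ‖w‖²) + b, which reduces to the standard dot-product neuron at alpha = 0 and to b − ½‖x − w‖² (an RBF-style term) at alpha = 1; after a ReLU, the alpha = 1 response is zero outside the ball ‖x − w‖ ≤ √(2b). The function names and the linear alpha schedule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def compact_support_neuron(x, w, b, alpha):
    """One assumed parameterization of the shape-parameter neuron.

    alpha = 0: relu(w.x + b)                 -- standard dot-product neuron
    alpha = 1: relu(b - 0.5 * ||x - w||^2)   -- RBF-style neuron, exactly zero
                                                outside a ball of radius sqrt(2b)
    """
    pre = x @ w - 0.5 * alpha * (np.sum(x * x, axis=-1) + np.sum(w * w)) + b
    return np.maximum(pre, 0.0)  # ReLU activation

def alpha_schedule(epoch, ramp_epochs=30):
    """Illustrative fine-tuning schedule: start from a pretrained standard
    network (alpha = 0) and ramp alpha linearly up to 1, retraining the
    weights at each step (the training loop itself is omitted here)."""
    return min(1.0, epoch / ramp_epochs)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=5)
    b = 1.0
    near = w + 0.1 * rng.normal(size=5)    # input close to the weight vector
    far = w + 100.0 * rng.normal(size=5)   # input far from the training region
    for alpha in (0.0, 0.5, 1.0):
        print(f"alpha={alpha}: near={compact_support_neuron(near, w, b, alpha):.3f}, "
              f"far={compact_support_neuron(far, w, b, alpha):.3f}")
```

Run as-is, the script prints a single neuron's response for a point near the weight vector and a point far from it; at alpha = 1 the far point maps to exactly zero, illustrating the compact support that underlies the out-of-distribution detection claim in the abstract.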