Separable Gaussian Neural Networks: Structure, Analysis, and Function Approximations
The Gaussian radial basis function neural network (GRBFNN) has been a popular choice for interpolation and classification. However, it is computationally intensive when the dimension of the input vector is high. To address this issue, we propose a new feedforward network, the separable Gaussian neural ne...
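The separability the abstract alludes to can be illustrated with a minimal sketch (not the paper's implementation, and the function names here are illustrative): a Gaussian RBF with per-dimension widths factors exactly into a product of one-dimensional Gaussians, so each input dimension can be processed independently.

```python
import numpy as np

def gaussian_rbf(x, c, sigma):
    """Full d-dimensional Gaussian RBF with per-dimension widths sigma."""
    return np.exp(-np.sum(((x - c) / sigma) ** 2) / 2.0)

def separable_gaussian(x, c, sigma):
    """Same value computed as a product of 1-D Gaussian factors,
    illustrating the separability exploited by a separable network."""
    factors = np.exp(-(((x - c) / sigma) ** 2) / 2.0)
    return np.prod(factors)

# Hypothetical sample point, center, and widths for the check.
x = np.array([0.3, -1.2, 0.7])
c = np.array([0.0, -1.0, 1.0])
sigma = np.array([1.0, 0.5, 2.0])

# The two formulations agree: exp of a sum equals a product of exps.
assert np.isclose(gaussian_rbf(x, c, sigma), separable_gaussian(x, c, sigma))
```

Because the product form evaluates one dimension at a time, a network built from such factors can reuse the per-dimension responses instead of recomputing a full d-dimensional exponent for every basis function.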
Main Authors: Siyuan Xing, Jian-Qiao Sun
Format: Article
Language: English
Published: MDPI AG, 2023-09-01
Series: Algorithms
Online Access: https://www.mdpi.com/1999-4893/16/10/453
Similar Items
- Arbitrarily Accurate Analytical Approximations for the Error Function
  by: Roy M. Howard
  Published: (2022-02-01)
- Explicitly Invertible Approximations of the Gaussian Q-Function: A Survey
  by: Alessandro Soranzo, et al.
  Published: (2023-01-01)
- Function approximation method based on weights gradient descent in reinforcement learning
  by: Xiaoyan QIN, Yuhan LIU, Yunlong XU, Bin LI
  Published: (2023-08-01)
- High-Accuracy Gaussian Function Generator for Neural Networks
  by: Cosmin Radu Popa
  Published: (2022-12-01)
- A Simplified Radial Basis Function Method with Exterior Fictitious Sources for Elliptic Boundary Value Problems
  by: Chih-Yu Liu, et al.
  Published: (2022-05-01)